When Tim Berners-Lee invented the World Wide Web, he foresaw its primary use as an avenue for the rapid, worldwide spread of information. And since its inception, the web has largely fulfilled this goal, serving as a medium for thousands of professors, writers, bloggers, and journalists to share their work with the world. Perhaps the most powerful trait of the internet (in terms of potential for societal disruption, that is) springs from this: the democratization of higher learning. Almost all of the largest universities and colleges, both private and public, offer online learning programs for interested students, often free of charge. What makes this so powerful is the potential for the spread of higher education to those who would not typically be able to access it. The internet enables people who might not be able to afford a college education to access all the benefits of one, eliminating a massive roadblock to their advancement in society. There’s just one problem: time. While these courses are largely free of charge, they still demand massive amounts of time and dedication to complete and fully absorb. For a while, this expense of time has served as a barrier in the place of price: people who couldn’t dedicate the hours these courses require were simply unable to partake in them. But now, with the current crisis facing the world, we have more free time than ever before, and the same way the internet eliminated the cost barrier to these courses, the Covid-19 situation has, if indirectly, eliminated the barrier of time commitment. And this elimination of the final barrier in the democratization of higher education is the spark to the fire that is the new digital enlightenment.
I foresee a large number of previously oppressed people taking advantage of their newfound wealth of time and spending it on what these wonderful courses have to offer, many of which serve as paths to new careers. Furthermore, I believe that, through these courses, these people will be able to advance socially to a far greater degree than ever before, their well-established grit and determination combining with newly acquired career knowledge to open doors that were previously closed to them, finally allowing them to achieve self-enlightenment. This internet-enabled digital enlightenment, while not immediately visible in this clouded time of conflict and distress, will be felt for decades to come, as previously oppressed people use what the world has given them to advance socially and make everyone’s lives better. So, the digital enlightenment has begun, and though you may not be able to see it now, you certainly will in the future.
As the Covid-19 situation continues to escalate, more and more Americans are working from home, performing tasks previously thought impossible outside the office environment. As more American businesses transition to remote work, more of their employees are discovering how much they enjoy it over traditional working arrangements. On the other end, administrators are reaping the benefits of a remote workforce as well, with the increases in productivity, available working time, and employee satisfaction that remote working brings. And as these businesses see how effective distance working can be, they are faced with the proposition of allowing their employees to work remotely even after this pandemic ends. But this raises a question: if advancements in society and technology allow America’s workforce to do, from anywhere in the country, jobs that previously required an office, who’s to say anyone in the world couldn’t take those positions? Outsourcing by hiring employees from other countries would most likely be more cost-effective than hiring Americans, and countries like China, India, and Russia have hundreds of thousands, if not millions, of candidates for American jobs. Remote working offers an exciting opportunity through this democratization of American employment, allowing thousands of eligible workers the chance to take jobs at American firms, an opportunity they previously lacked solely because of where they live. However, it also sets a dangerous precedent for the preexisting American workforce, as the ability for companies to hire workers of equal or even greater skill for a fraction of the cost would almost certainly mean a large percentage of American workers losing their jobs to foreign workers working remotely. So do we need a “digital border wall” of sorts?
Do we need some form of protection for the American workforce? Hell no. This type of opportunity is exactly what makes technology so great: it affords people the ability to do things they previously couldn’t, it breaks boundaries, it disrupts. We shouldn’t stand in the way of innovation, revolution, and democratization for the sake of protecting American jobs.
It’s easy to see that Apple is a company whose products sell themselves as much on design as they do on functionality. Design is one of the most important traits Apple has harnessed to distinguish itself from the competition, a strategy that can be traced all the way back to the iMac, whose colorful, fun design stood in stark contrast to the monotonous beige and black boxes of its contemporary competitors. But today, Apple seems to rely on design less than ever, as many of its products have designs similar, if not identical, to those of their competitors. While this isn’t entirely Apple’s fault, with many competitors intentionally imitating Apple’s designs, Apple certainly doesn’t seem to be doing much about it, instead sticking to a well-established design language that gets more boring and less special with every new product release. This idea of Apple losing its design touch was exacerbated by the announcement of Jony Ive’s departure over the summer, which, in the eyes of many, was the final nail in the coffin for Apple’s winning streak of innovative and unique design. But while these facts might suggest that Apple’s design is in decline, that isn’t the full story. Sure, Apple’s industrial design may be losing its luster, but that doesn’t mean its design overall is degrading; in fact, you could argue it’s getting better. It’s not just the way Apple products look that influences people to purchase them, but also, to an equal or even greater extent, the way they work and feel. And while Apple’s industrial design may be becoming less unique, its user experience design is certainly the opposite.
Apple’s design teams have made strides in creating amazing experiences for and between their products, such as the AirPods, which require minimal thought and effort to connect to your iPhone or iPad, or the new Magic Keyboard, which effortlessly transforms the iPad Pro into a desktop-grade computer. On the whole, Apple’s approach to design hasn’t gotten worse; it has just changed, adapting to new consumer patterns. Apple doesn’t need its products to look the best to sell anymore: most of the people who care about that kind of thing already have iPhones, and Apple just needs to make it harder for them to leave by designing the best user experiences within its ecosystem. For the people who don’t have iPhones, Apple’s design and integration between hardware and software look good enough to entice them over. But while this approach works for now, it won’t always, and good enough won’t always cut it. As more companies adopt the ecosystem approach Apple pioneered, their integrated user experiences will get better, and, most dangerously for Apple, they will be able to offer them at a more competitive price. So while Apple’s approach to design seems to be working out for now, that won’t always be the case, and there will always be a need for thoughtful, innovative design over at 1 Infinite Loop.
(While most of Apple’s staff has migrated to Apple Park, the hardware design team, for the most part, still resides at the old campus at 1 Infinite Loop, most likely to keep important, secretive product designs from being seen by the wrong eyes.)
When Steve Jobs returned to Apple in the late 1990s, he used one skill to single-handedly take Apple from one of the world’s most disastrous companies to one of its most popular and powerful. That skill was focus, and for a company that completely lacked it before Jobs brought it back with him, focus meant the difference between rising to become one of the greatest companies in the world and falling into obscurity. But now Steve Jobs is gone, and since his death Apple has grown exponentially, its scope expanding just as fast as its market cap. That isn’t necessarily a good thing. These days it’s easy to see how Apple has become less and less focused, with the trillion-dollar tech firm expanding into dozens of new categories every year, including finance, entertainment services, health, and possibly even the automotive industry. And while this may initially seem like a good thing, in reality each time Apple expands into a new market, it diverts more focus into that market, taking away valuable concentration from the markets it is already in. This is most visible in its software, such as its various operating systems and the design of the applications within them. The past few versions of these OS updates have been notoriously bug-heavy, and the apps within them have become noticeably less user-friendly. While this can neither be confirmed nor denied, this visible decline in software quality and design can easily be diagnosed as a byproduct of a lack of focus, with Apple’s presence in each of the markets it occupies drawing vital focus away from the others. To conclude, Apple is losing focus, most likely because it is trying to do too many things at once.
This is dangerous because Apple has survived and thrived as a company that provides great products with some of the best possible user experiences, which it has been able to do through extreme focus. From software and hardware design to packaging and marketing, Apple’s extreme focus has given it the ability to refine even the most minute details, allowing it to provide users with amazing, top-of-the-line experiences. But should Apple lose the focus that has been so important to it in the past, then the Apple that changed the world so many years ago might as well have died along with Steve Jobs.
Ten years ago, what we now know as the iPad was the most hyped tech product of the 2000s. Apple’s then heavily rumored and highly anticipated tablet was made out to be a revolutionary new product that would use state-of-the-art technology to deliver a user experience so new and unique that it would forever change the way we used computers. The iPad held the promise of transforming the then-bland personal computer the same way the iPhone had transformed the then-bland mobile phone just a few years earlier. Then it came out. The easiest way to describe the general public’s initial response to Steve Jobs’s grand unveiling of the iPad is to say that the disappointment that coincided with the announcement was equal to, if not greater than, the hype that led up to it. The iPad was simply so overhyped, and really so misunderstood, that when people finally saw it, they saw an oversized iPhone, not a revolutionary new computing device. Of course, as the iPad continues to advance as a product, the latter description becomes more and more apt, but that doesn’t change the fact that the public’s initial response was a relatively underwhelmed one. Where this applies to the rumors and speculation surrounding Apple’s potential AR product is in the hype those rumors carry. Like the iPad, Apple’s AR glasses, if they even exist, have been rumored for years, and as time goes by, those rumors, and more importantly the hype that comes with them, continue to grow. Through this comparison it’s easy to see how similar the run-up to Apple’s AR product is to that of the iPad. Furthermore, this similarity will likely continue to develop once whatever AR device Apple is rumored to be working on finally gets released.
To conclude, like the iPad, Apple’s AR glasses won’t change the world, and they won’t change the way the world makes computers, but by peeling away just a few more layers of complexity, they will, if gradually, change the way we use them.
As different forms of artificial intelligence emerge and grow, concerns over their effect on society have grown just as quickly as hopes for their applications. The idea that robots or some form of AI will one day take your job is rapidly becoming more tangible, as tech firms both big and small make massive advancements in machine learning and artificial intelligence. As these advancements continue, AI systems become more and more proficient at the same tasks our professions demand, and soon enough they will be proficient enough to replace us, with a lower cost of operation to justify the replacement. But this doesn’t have to be a bad thing. One must simply look to history for evidence of this claim. In ancient Athens, prior to the introduction of slavery there, Athenian life was quite analogous to ours today. The general population, for the most part, worked menial jobs and had a decidedly poor quality of life, working simply to get by with very little time for enjoyment. After slavery was introduced, which I am in no way saying was a good thing, it became inefficient for the general population to continue doing those menial tasks, as the enslaved population could be made to work for a one-time price instead of the wages free workers demanded. A distinct parallel can be drawn here between ancient Athens and the modern world: like slaves, an AI-powered workforce wouldn’t demand the wages the modern workforce does, instead requiring a simple entry price. So what happened to the general population once slaves took their jobs? Well, they came up with new ones.
They developed new jobs and trades that afforded them a far better quality of life, and these new trades and the ideas that developed with them spurred a golden age of intellectual enlightenment in Athens, out of which some of the greatest ideas, inventions, and thinkers were born. AI can, and probably will, spur a similar age of enlightenment upon its popularization. What’s more, it won’t require the suppression of living beings the way the introduction of slavery did, giving it all the benefits with none of the drawbacks. So instead of thinking about robots inevitably taking your current job, think about what you will do with your newfound freedom: how will you take advantage of the next age of enlightenment?
Last month, Apple was on a roll when it came to new product releases. The MacBook Air, iPad Pro, Mac mini, and more all saw new, upgraded releases, all met with the usual fanfare that coincides with new Apple products. But out of all of these releases, two clearly resonated the most with people: the new iPhone SE, and a set of wheels for the Mac Pro. While at first these two products would seem to have nothing to do with each other, on closer look, the similarity between the reasons for the hype around their releases is apparent. That similarity is value: more specifically, the inherent value each of these products provides for its price. The new iPhone SE made waves for being the first truly affordable-to-the-mass-market iPhone in years, with a low $400 asking price that nets users a device with the same processor as the $1,000 2019 iPhone 11 Pro, in the dated but still well-designed form factor of 2017’s iPhone 8, which the SE supersedes in the iPhone lineup. On the other hand, the brand-new set of wheels Apple released for the Mac Pro costs an almost unbelievable $699, a price tag that understandably shocked the world upon its announcement. So what is Apple’s game here? Is it trying to convey a message of affordability by releasing a $400 iPhone, or trying to position itself as a luxury brand by releasing a set of $700 wheels for its already expensive $5,000 professional computer? Well, the answer is both. Tim Cook understands the value of branding, and he knows that positioning Apple as a company that provides high-quality products with matching high-quality user experiences is key to retaining Apple’s image. But he also knows that, while these products need to be priced to be accessible to as many people as possible, pricing every product that way could dilute the image of Apple as a high-quality brand.
So Apple has to maintain a balancing act: releasing affordable, accessible products that cater to more frugal customers, while also releasing luxury-priced products to uphold the image of the Apple brand. To do this, Apple typically attaches these luxury prices to non-essential products, like accessories: products that aren’t needed for a great user experience, but certainly benefit it when owned. Such products include Apple’s $250 AirPods Pro, $350 Magic Keyboard, and, in this case, the $700 wheels. So, to conclude, Tim Cook doesn’t want Apple to look like a luxury brand, such as Ferrari or Lamborghini, but he doesn’t want it to appear as a budget brand either, such as Ford or Toyota. He wants Apple to appear as a high-quality brand that offers products with high but worthwhile prices. To do this, Apple sells products that allow frugal customers to have a good user experience at a price they can swallow, while still offering luxury-priced products that fewer people will buy, or need to buy, to uphold the brand’s image.
Focus. Any Silicon Valley guru will tell you it’s the most important trait a tech company can have, and it’s true. In a world where tech can be applied to virtually anything, focusing on one sub-sector of it is often the key to success for a tech company, all the way from the startup to the monopoly level. For the last few years, however, some of the biggest tech firms in the world have seemingly begun to ignore this fact, most notably Google and Amazon. Over the past decade, these companies have strived to build ecosystems around their products and platforms, with the intent of locking users in and making it difficult for them to leave. Apple employs a similar practice, visible in products such as the AirPods, which integrate seamlessly with the rest of its products, which in turn integrate seamlessly with each other. However, where companies like Google and Amazon differ from Apple is in the services and products they use to reinforce their ecosystems. While all of Apple’s ecosystem-reinforcing products fit into its ecosystem clearly and cleanly, Amazon’s and Google’s do so to a far lesser extent. For example, one way Amazon builds up its ecosystem is by building products for both enterprise and consumer markets, but these products don’t have clear lines connecting them to one another, and instead loosely fit into an overall ecosystem built out of much smaller ecosystems. This all has to do with the fact that, in this day and age, monopolies are essentially legal, and companies like Amazon and Google take advantage of this, spreading into as many different markets as they can. They can afford to do this, too, as even if they lose money in one category, they can subsidize those losses with the gains from their more successful products and categories, giving them a substantial advantage over smaller firms.
But just as this practice gives companies like Amazon and Google a key advantage over smaller firms, it also leaves them with a substantial weakness: they can’t focus on one subcategory the way smaller firms can. And as companies like Amazon and Google grow and spread into more and more markets, their focus becomes more and more diluted, until they can’t apply sufficient focus to any of the markets they occupy. That allows smaller firms that can focus to beat them out and take over, slowly eating away at these monopolies until there is nothing left but a stumbling husk carrying the name of a company once synonymous with domination, domination destroyed by that company’s own lack of focus.
Five years ago, Google and Amazon were touting the digital assistant as the future of consumer electronics. The promise of a smart assistant deeply integrated into your home seemed enticing: with countless possibilities and a more simplified user interface, it seemed the digital assistant could become a truly massive new platform, up there with the likes of those built on the smartphone and personal computer. But five years on, it’s not difficult to see that Google and Amazon’s dream for the digital assistant hasn’t really taken off. While digital assistants have certainly become more popular, they haven’t become much more powerful, and because of this we’re largely using them for the same things we were five years ago, that is: asking what the weather is and queuing Spotify playlists. But it doesn’t have to be this way. Digital assistants can be truly useful and powerful tools, tools that could greatly simplify our lives and the technology they are integrated with. But accomplishing that level of usefulness would require more than the efforts of just Amazon, Google, and Apple; it would require a new digital assistant “operating system” of sorts, with different developers integrating their own digital assistants as applications within those “operating systems”. This approach would allow digital assistants much deeper and tighter integration with third-party applications and services, while still giving the companies behind them control over their respective platforms. Digital assistants aren’t going to get better through their makers’ efforts alone; they will only reach their true potential with the combined effort of thousands of developers working for hundreds of companies.
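To make the “operating system” idea above a bit more concrete, here is a minimal, purely illustrative sketch in Python of what such a platform could look like: the platform owns intent routing, while third-party developers register their own assistants as applications within it. Every name here (`AssistantPlatform`, `register_skill`, the example skills) is hypothetical, not any real vendor’s API.

```python
# Illustrative sketch of a digital assistant "operating system":
# the platform routes user requests, third parties supply the skills.
from typing import Callable, Dict, Optional


class AssistantPlatform:
    """Routes user utterances to whichever registered skill claims them."""

    def __init__(self) -> None:
        # Maps an intent keyword to the third-party handler serving it.
        self._skills: Dict[str, Callable[[str], str]] = {}

    def register_skill(self, intent: str, handler: Callable[[str], str]) -> None:
        """A third-party developer installs their assistant for one intent."""
        self._skills[intent] = handler

    def handle(self, utterance: str) -> Optional[str]:
        """Dispatch the utterance to the first skill whose intent matches."""
        for intent, handler in self._skills.items():
            if intent in utterance.lower():
                return handler(utterance)
        return None  # No installed skill claimed the request.


# Usage: two hypothetical developers plug their services into the platform.
platform = AssistantPlatform()
platform.register_skill("weather", lambda u: "It's 72°F and sunny.")
platform.register_skill("playlist", lambda u: "Queuing your playlist now.")

print(platform.handle("what's the weather like?"))
print(platform.handle("play my workout playlist"))
```

The point of the sketch is the separation of concerns: the platform owner keeps control of routing and the user-facing surface, while the actual capabilities come from the thousands of outside developers the paragraph above argues are needed.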
As the computer has become more and more ubiquitous over the past few decades, programming has grown more popular along with it. This popularization is immediately visible: we’ve seen school after school invest in computer science programs, and dozens of apps that aim to make programming easier and more understandable have flooded app stores across all platforms. Needless to say, programming is more popular than ever, but should it become so popular that everyone learns it? Should we put as much emphasis on learning a programming language as we do on learning a primary language, such as English? Well, not exactly. While programming is certainly extremely important to the advancement of technology, that doesn’t mean everyone should learn it. Learning to program is more like learning an instrument, or even a trade, than another language, because learning a programming language is typically done for the sake of making something with it. When you learn a world language, you typically do so to communicate. When you learn a programming language, you do so to build. The type of work that goes into learning a programming language, while certainly similar to that of a world language, is much more analogous to learning an instrument. I guess what I’m trying to say is that programming requires a specific mindset, the way an instrument does, and that means we shouldn’t try to force it on everyone, but we should certainly offer it. On the other hand, the computer science classes that more and more schools are developing are certainly important, if not for teaching computer skills, then for teaching computer literacy, which matters more and more as technology becomes intertwined with our everyday lives.