For the past few decades, the spread and democratization of technology kickstarted by the introduction of the first personal computers in the 1980s has gradually become a defining force in our lives, with our daily routines intrinsically infused with, and, in turn, defined by, the technology we use for our work, entertainment, and well-being. As these technologies have become more and more important to us, our lives have responded in kind, with our lifestyles growing in new directions led by their relationships with the technology we have become so dependent on. One field that has been completely redefined by this growing relationship is the workplace, with many of the jobs held by Americans today being defined by the modern technologies put to use within them. Even outside of workplace culture, we can see how technology has slipped itself into everything, even in the most unexpected places. And this is a good thing: the more people who can use technology, the higher the demand for further technological advancement.
In the late 2000s and early 2010s, each new iPhone launch was met with an outstanding level of hype. At the time, the technology was so new and exciting that every year Apple’s smartphone would make the news with its shiny new features, with thousands of people waiting in lines outside Apple stores for the chance to get their hands on one on release day. But these days, that hype has largely died down, and while there is certainly buzz surrounding new iPhone releases, it’s nowhere near the level it was in the time of the first iPhones. New features and designs have become predictable, often consisting of improved performance, cameras, and battery life, all notable improvements, but not innovations in the same way that the iPhone 4’s FaceTime camera or the iPhone 5s’ fingerprint sensor were. This decline in hype is what leads some tech analysts to say that smartphone innovation has plateaued and the platform is on the edge of decline. Many see smartphones going in the same direction PCs went in the late 1990s and early 2000s, when performance and innovation had plateaued and customers weren’t upgrading nearly as often as they had in the past as a result. On the other hand, smartphone manufacturers are desperately trying to fend off this curve, largely by introducing gimmicky new features like modularity and three-dimensional displays, or by developing new form factors for phones, such as the ever so popular folding phone. The truth is that these new features have done very little to generate hype around new releases, and even less to entice users to upgrade to newer models. But does this mean the smartphone is dead? Not exactly. While innovation in the smartphone space has certainly begun to plateau, smartphones will remain relevant for the next few years, if for nothing else than the lack of a better alternative.
Back in the early 90s, desktop PCs continued to sell despite a plateau in innovation largely because consumers didn’t have anywhere else to go for their computing needs; laptops and portables had yet to pack sufficient power into their smaller frames, making them impractical for many users. The same is true now: wearables are not yet usable enough to offer any real competition or threat to the well-established smartphone, and until they can, the smartphone will be the go-to mobile computer for the general population. The real issue lies in holding on to customers and pushing them to upgrade, so that companies developing products for the post-smartphone future have enough income to do so. One avenue growing in popularity is the subscription, or recurring-revenue, approach, which sees customers paying a certain amount over a period of time for a phone they don’t own but essentially lease until a newer model comes out, at which point the process repeats. This approach, inspired by software-as-a-service and the automotive industry, is popular among companies like Apple because it demands less innovation and guarantees a steady stream of income, while still letting users get a shiny new device each year, even if it isn’t drastically different under the surface from their last. Another way companies achieve the same result is by charging more for their phones: as people tend to hold onto their devices for longer, charging more per device allows companies to make the same amount of money off of each customer over that longer ownership cycle. One final approach is the much-maligned practice of artificially slowing down older phones so as to push users toward newer, seemingly faster devices. While this practice gets a bad rap, it does have some benefits, like moving users to newer devices with better user experiences that they may not have tried without said push.
To conclude, the smartphone has definitely slowed down in terms of innovation, but it’s far from dead. Nothing currently on the market offers any sort of real alternative to it, and companies like Apple and Samsung will continue to push new devices to users in new and unique ways, with success, until that is no longer the case.
When Steve Jobs iconically pulled the first ever MacBook Air out of an envelope way back in 2008, few could foresee the impact that the laptop would have on the computer industry as a whole. Now, over a decade later, it is easy to see how the then unbelievably thin and light profile of the device heavily informed almost all laptops that came after it, with many of the design elements that made the original MacBook Air’s profile so iconic still present in a majority of laptops on sale today. But what many fail to realize is that the original 2008 MacBook Air was largely seen as a failure, a rarity for Apple. It was seen as such for a number of reasons: mainly a lack of ports, a lackluster display, and, most importantly, comparatively terrible performance when put up against both Apple’s own notebooks and those from other manufacturers. All of these issues were exacerbated by a steep price of $1,799, a price made almost entirely unjustifiable given the aforementioned flaws. It wasn’t until the second-generation Air, introduced two years later, that these flaws were fixed and the MacBook Air became the mainstay of coffee shops and college lecture halls that it is so well known as today. This anecdote matters to the subject of Apple Silicon in Macs today because Apple more recently took a similar approach with another product, and was met with strikingly familiar results. In 2015, Apple announced the brand new MacBook, sans “Pro” or “Air”; it was just the MacBook. With the MacBook, Apple ushered in a brand new design language and ethos for the Mac, one that emphasized portability and versatility in their popular line of laptops, and promised to change the way laptops were designed in the same way the original MacBook Air had. Spoiler alert: it didn’t.
However, where the original MacBook Air and the 2015 MacBook were truly similar was in the numerous problems that plagued their respective releases. Much like the original 2008 MacBook Air, the 2015 MacBook was seen as lacking in the connectivity department, too underpowered, and, once again, far too expensive for what it was. All of these issues culminated in Apple pulling the MacBook from sale in 2019, after just four years, replacing it with a redesigned MacBook Air. But this doesn’t need to be the end for the MacBook. The issues that came up with it echoed through the rest of Apple’s notebook lineup from 2015 until very recently, as those flaws resulted from the scale of form and function leaning too far toward the former. But with the recent announcement that Apple will begin to use its own processors in the Mac lineup, the most pressing of these issues could be solved. Apple’s custom silicon offers several key advantages for the Mac lineup. First is improved power efficiency. Apple’s custom processors are based on the ARM architecture, which, as a RISC (reduced instruction set computer) design, promises much more performance per watt than Intel’s processors, meaning the power issues that faced the 2015 MacBook and the later Macs that followed in its design footsteps could be eliminated. Furthermore, using in-house processors offers a significant cost-cutting advantage for Apple, as it reduces the overhead that comes from buying processors from firms like Intel and AMD. This could answer the concerns over the ever-rising prices and questionable value of Apple’s Macs by allowing Apple to reduce prices on the lower end of the lineup and pack more value per dollar into the higher end. To conclude, Apple’s transition to its own processors in the Mac offers a unique opportunity to solve the issues that have faced the product line for years.
I think that one of America’s biggest problems right now comes directly from the way it views itself. Nationalism is more visible in our country than it has been in a long while, perhaps even since World War II. While I believe that national pride is by all means a good thing, like with all things, too much of it can become hurtful. Today, I think people see America as an object, whereas we should be looking at it as a group of people. This is where we derive the idea of disrespecting it when someone kneels during the national anthem, or when we say that one party is more patriotic than the other. We’ve made America an object, and in doing so, we’ve attached intrinsic values to it, values like what we call patriotism. I feel that the values we attach to this objectified idea of America don’t do anything to bring the country together, and instead have the opposite effect. This approach is no different from identity politics; it’s a product of harsh overgeneralization. Instead of viewing America as an object represented by a flag that can be “disrespected,” we should look at America as what it really is: a group of people from radically different backgrounds. Instead of seeing the flag as something that can be disrespected, we should look at how we disrespect each other, as this is what really matters. No one gets directly hurt when someone kneels during the national anthem; sure, it may hurt our pride, but at the end of the day we’re still breathing, right? But when we disrespect each other, we not only hurt another person, we hurt the country as a whole, as we destroy the bonds that hold it together. We need to stop objectifying America as some kind of symbolic icon, and we need to start personifying each other and seeing each other as what we really are: human beings.
When Tim Berners-Lee invented the World Wide Web, he foresaw its primary use as an avenue for the rapid, worldwide spread of information. And since its inception, the web has primarily fulfilled this goal, serving as a medium for thousands of professors, writers, bloggers, and journalists to share their work with the world. Perhaps the most powerful trait of the internet (in terms of potential for societal disruption, that is) springs from this: the democratization of higher learning. Almost all of the largest universities and colleges, both private and public, offer online learning programs for interested students, often free of charge. What makes this so powerful is the potential for the spread of higher education to those who would not typically be able to access it. The internet enables people who might not be able to afford a college education to access all the benefits of one, eliminating a massive roadblock to their advancement in society. There’s just one problem: time. While these courses are mostly free of charge, they still require massive amounts of time and dedication to complete and to fully absorb the knowledge they contain. For a while, this expense of time has served as a barrier in place of price: people who don’t have the time to dedicate to these courses are unable to partake in them. But now, with the current crisis facing the world, many of us have more free time than ever before, and in the same way the internet eliminated the cost barrier to these courses, the Covid-19 situation has, if indirectly, eliminated the barrier of time commitment. And this elimination of the final barrier in the democratization of higher education is the spark to the fire that is the new digital enlightenment.
I foresee a large number of previously oppressed people taking advantage of their newfound wealth of time and spending it on what these wonderful courses have to offer, many of which serve as paths to new careers. Furthermore, I believe that, through taking advantage of these courses, these oppressed peoples will be able to socially advance to a far greater degree than ever before, with their well-established grit and determination for success, in combination with their newly acquired career knowledge, opening doors that were previously closed to them, finally allowing them to achieve self-enlightenment. This internet-enabled digital enlightenment, while not immediately visible now in this clouded time of conflict and distress, will be felt for decades to come, with previously oppressed people using what the world has given them to socially advance and make everyone’s lives better. So, the digital enlightenment has begun, and though you may not be able to see it now, you certainly will be able to in the future.
As the Covid-19 situation continues to escalate, more and more Americans are working from home, performing tasks previously thought to be impossible outside of the office environment. And as more American businesses transition to remote work, more and more of their employees are finding how much more they enjoy it over their traditional working arrangements. On the other end of these businesses, administrators are reaping the benefits of having a remote workforce as well, with the increases in productivity, available working time, and employee satisfaction that remote work brings. And as these businesses see how effective distance working can be, they are faced with the proposition of allowing their employees to work remotely even after this pandemic ends. But this raises a question: if advancements in society and technology allow America’s workforce to work from anywhere in the country in jobs that previously required an office, who’s to say anyone in the world couldn’t take those positions? Certainly, outsourcing by hiring employees from other countries would most likely be more cost-effective than hiring Americans, and countries like China, India, and Russia have hundreds of thousands, if not millions, of candidates for American jobs. Remote work offers an exciting opportunity in the business world through this democratization of American employment, allowing thousands of eligible workers the chance to take jobs at American firms, an opportunity they lacked before solely because of where they live. However, this idea also sets a dangerous precedent for the preexisting American workforce, as the ability for companies to hire workers of equal or even greater skill than that found in American workers for a fraction of the cost would almost certainly mean that a large percentage of American workers would lose their jobs to foreign workers working remotely. So do we need a “digital border wall” of sorts?
Do we need some form of protection for the American workforce? Hell no. This type of opportunity is exactly what makes technology so great: it affords people the ability to do things they previously weren’t able to do, it breaks boundaries, it disrupts, and we shouldn’t stand in the way of innovation, revolution, and democratization for the sake of protecting American jobs.
It’s easy to see that Apple is a company whose products sell themselves as much on design as they do on functionality. Design is one of the most important traits that Apple has harnessed to distinguish itself from the competition, a strategy that can be traced all the way back to the iMac, whose colorful, fun design stood in stark contrast to the monotonous beige and black boxes of its contemporary competitors. But today, it would seem that Apple relies on design less than ever, as many of its products have designs similar if not identical to those of their competitors. While this isn’t completely Apple’s fault, with many of its competitors intentionally making their products resemble Apple’s, the company certainly doesn’t seem to be doing much about it, instead choosing to stick to a well-established design language that gets more boring and less special with every new product release. This idea of Apple losing its design touch was exacerbated by the announcement of Jony Ive’s departure over the summer, which, in the eyes of many, was the final nail in the coffin for Apple’s winning streak of innovative and unique design. But while it would seem, taking these facts into consideration, that Apple’s design is on the decline, that isn’t the full story. Sure, Apple’s industrial design may be losing its luster, but that doesn’t mean its design overall is degrading; in fact, you could even argue that it’s getting better. You see, it’s not just the way Apple products look that influences people to purchase them, but also, to an equal or even greater extent, the way they work and feel. And while Apple’s industrial design may be becoming less unique, its user experience design is certainly the opposite.
Apple’s design teams have made strides in creating amazing experiences for and between their products, such as those made with AirPods, which require the minimum amount of thought and effort to connect to your iPhone or iPad, or the new Magic Keyboard, which effortlessly transforms the iPad Pro into a desktop-grade computer. On the whole, Apple’s approach to design hasn’t gotten any worse; it’s just changed, adapting to new consumer patterns. Apple doesn’t need its products to look the best to sell them anymore: most of the people who care about that kind of thing already have iPhones, and Apple just needs to make it harder for them to leave by designing the best user experiences within its ecosystem. And the people who don’t have iPhones see Apple’s design and integration between hardware and software as good enough to entice them over to that ecosystem. But while this approach works for now, it won’t always, and good enough won’t always cut it. As more and more companies adopt the ecosystem approach pioneered by Apple, their integrated user experiences will get better, and, most dangerously for Apple, they will be able to offer them at a more competitive price. So while for now Apple’s design and its approach to it seem to be working out, that won’t always be the case, and there will always be a need for thoughtful, innovative design over at 1 Infinite Loop.
(While most of Apple’s staff has migrated over to Apple Park, the hardware design team, for the most part, still resides at the old campus at 1 Infinite Loop, most likely to keep important, secretive product designs away from the wrong eyes.)
When Steve Jobs returned to Apple during the late 1990s, he used one skill to single-handedly take Apple from one of the world’s most disastrous companies to one of its most popular and powerful ones. That skill was focus, and for a company that completely lacked it before Jobs brought it with him upon his return, focus meant the difference between rising to become one of the greatest companies in the world and falling into obscurity. But now Steve Jobs is gone, and since his death, Apple has grown exponentially, with its scope of focus expanding just as fast as its market cap. That isn’t necessarily a good thing. These days it’s easy to see how Apple has become less and less focused, with the trillion-dollar tech firm expanding into dozens of new categories every year, including finance, entertainment services, health, and possibly even the automotive industry. And while this may initially seem like a good thing, in reality that isn’t the full story: each time Apple expands into a new market, it diverts more and more focus into said market, taking away valuable concentration from the markets it is already in. This is most visible when it comes to their software, such as their various operating systems and the design of the applications within them. The past few versions of these OS updates have been notoriously bug-heavy, and the apps within them have become noticeably less user-friendly. And while this can be neither confirmed nor denied, this visible decline in software quality and design can very easily be diagnosed as a byproduct of a lack of focus, with Apple’s presence in each of the markets it occupies taking vital focus away from the others. To conclude, Apple is losing focus, most likely because it is trying to do too many things at once.
This is dangerous because Apple has survived and thrived on being a company that provides great products with some of the best possible user experiences, which it has been enabled to do by extreme focus. In everything from software and hardware design to packaging and marketing, Apple’s extreme focus has given it the ability to refine even the most minute details, allowing it to provide users with amazing, top-of-the-line experiences. But if Apple should lose this focus that has been so important to it in the past, then the Apple that changed the world so many years ago might as well have died along with Steve Jobs.
Ten years ago, what we now know as the iPad was the most hyped tech product of the 2000s. Apple’s then heavily rumored and highly anticipated tablet was made out to be a revolutionary new product that would utilize state-of-the-art technologies to provide a user experience so new and unique that it would forever change the way we used computers. The iPad held the promise of transforming the then-bland personal computer the same way the iPhone had transformed the then-bland mobile phone just a few years earlier. Then it came out. The easiest way to describe the general public’s initial response to Steve Jobs’s grand unveiling of the iPad is to say that the amount of disappointment that coincided with the announcement was equal to, if not even greater than, the amount of hype that led up to it. The iPad was simply so overhyped, and really, so misunderstood, that when people finally got to see it, they saw an oversized iPhone, not a revolutionary new computing device. Of course, as the iPad continues to advance as a product, the latter of these two descriptions becomes more and more apt, but that doesn’t change the fact that the public’s initial response to the iPad was a relatively underwhelmed one. And where this applies to the rumors and speculation surrounding Apple’s potential AR product is in the hype that can be seen in those rumors. Like the iPad, Apple’s AR glasses, if they even exist, have been rumored for years, and as time goes by, those rumors, and more importantly the hype that comes with them, continue to grow. Through this comparison it’s easy to see how similar the time leading up to the announcement and release of Apple’s AR product is to that of the iPad’s. Furthermore, this similarity between the two devices will likely continue to develop once whatever AR device Apple is rumored to be working on finally gets released.
To conclude, like the iPad, Apple’s AR glasses won’t change the world, and they won’t change the way the world makes computers, but, by peeling away a few more layers of complexity, they will, if gradually, change the way we use them.
As different forms of artificial intelligence emerge and grow, concerns over their effect on society have grown just as quickly as hopes for their applications. The idea that robots or some form of AI will one day take your job is rapidly becoming more and more tangible, as tech firms, both big and small, make massive advancements in the fields of machine learning and artificial intelligence. As these advancements continue, different forms of AI become more and more proficient at the same tasks our professions demand, and soon enough they will be capable enough to replace us in those professions, with a lower cost of operation to justify such a replacement. But this doesn’t have to be a bad thing. One must simply look to history for evidence of this claim. In ancient Athens, prior to the introduction of slavery there, Athenian life was quite analogous to ours today. The general population, for the most part, worked menial positions and had a decidedly poor quality of life, working simply to get by with very little time for enjoyment. However, after slavery was introduced, which I am in no way saying was a good thing, it became inefficient for the general population to continue doing those menial tasks, as the enslaved population could be made to work for a one-time price instead of the wages that free workers demanded. A distinct parallel can be drawn here between ancient Athens and the modern world: like slaves, an artificial-intelligence-powered workforce wouldn’t demand the wages the modern workforce does, instead requiring a simple entry price. So what happened to the general population once slaves took their jobs? Well, they came up with new ones.
They developed new jobs and trades that afforded them a far better quality of life, and these new trades, along with the ideas that developed with them, spurred a golden age of intellectual enlightenment in Athens, out of which some of the greatest ideas, inventions, and thinkers were born. AI can and probably will spur a similar age of enlightenment upon its popularization. What’s more, it won’t require the suppression of living beings like the introduction of slaves did, giving it all the benefits with none of the drawbacks. So instead of thinking about robots inevitably taking your current job, think about what you will do with your newfound sense of freedom. How will you take advantage of the next age of enlightenment?