Jony Ive made the news last summer when it was announced that the longtime Apple chief designer, responsible for giving a personality to many of the company’s most important products, would be leaving. The world was shocked; Ive had been a monumentally important force at Apple, with many considering his role nearly equal in significance to that of Tim Cook, the company’s CEO. Though he has only been gone for just over two months, Ive’s absence has quickly become noticeable, especially in Apple’s design. Newer products like the new Mac Pro, MacBook Pro, and even the new iPhone serve as shining examples of a clear departure from Jony Ive’s design philosophy of form over function, a philosophy that carried Apple from its earlier failures to where it is today. These devices are thicker, larger, and more functional than before, all while being less minimalist. A great example is the iPhone, which received a torrent of online criticism for its controversial camera design, which, while allowing for greater photographic potential, stuck out like a sore thumb compared to earlier designs. So, is this the future of Apple’s design: function over form? Tim Cook’s history as a successful industrial engineer, a role that exists to cut costs and add functionality with little consideration for the beauty of a product, would certainly lend evidence to this theory. However, design plays a massive role at Apple, often setting it apart from its competitors, so how will a decreased emphasis on it affect the company? To answer that, we will have to wait and see.
In the tech world, a successful transition is a rarity. Flops like Windows Vista, Windows 8, and, well, pretty much every version of Windows after ’95 serve to prove this point. This curse, however, seemingly does not afflict Apple. Throughout its history, the company has pulled off numerous successful transitions: from Mac OS 9 to Mac OS X; even further back, from the Apple I to the Apple II; and, most relevant to this topic, the monumental architecture transition from PowerPC to Intel. The Intel transition should go down in history as a master class in product transitions; in just one year, Apple successfully, and, more importantly, gracefully, moved the entire Mac lineup from the PowerPC architecture to Intel’s, and saw widespread adoption after doing so. This transition does not get nearly the attention it deserves, and even compared to everything the company had done before and has done since, it remains one of the most impressive feats in Apple’s history. But now another, equally massive transition could be coming: if the rumors are correct, the Mac could be moving to ARM. The ARM architecture, which powers Apple’s iPhones, iPads, and Watches, has several key advantages over Intel’s, which is currently used in all of Apple’s Macs. For one, ARM chips are significantly more efficient; their lower power consumption can equate to up to double the battery life of comparable Intel computers. Furthermore, ARM processors support integrated cellular radios, which, with the advent of 5G, could prove tremendously useful. Finally, and specific to Apple, the company already produces ARM chips for its phones, watches, tablets, and more. This could make ARM-based Macs significantly cheaper than their Intel predecessors, as in-house processor production would severely undercut the cost of manufacturing.
While all of these prospects could serve as reasons why Apple could make an ARM-based Mac, they’re not why Apple should make one. In fact, the reason Apple should make an ARM-based Mac doesn’t pertain to the Mac at all; it pertains to the iPad. Apple sees the iPad as the future of the personal computer, but it’s pretty clear that not everyone else does. While the company has made strides to bring the iPad closer to that vision, such as a dedicated first-party hardware keyboard and, more recently, iPadOS, with its range of previously Mac/PC-exclusive abilities, many people still don’t see the iPad as a computer. A major contributing factor to this sentiment is the lack of apps, many of which are deemed essential to a computing experience. This includes apps like Lightroom, Photoshop, and even Apple’s own Final Cut, apps that entire careers are built on yet that have no equivalent offering on the iPad. The reason is a lack of developer support: developers don’t see the iPad as a viable platform for their applications given the time and cost needed to develop for it. A large part of that cost comes from having to port Mac or PC apps designed to run on Intel processors over to iOS apps designed to run on ARM processors. Porting an app from Mac or PC to iPad or iPhone is no easy task, and developers simply don’t see it as a worthwhile one, given that the large majority of computer users remain on the Mac because that’s where the apps are; it’s a catch-22. An ARM Mac, however, could break this loop. Releasing a Mac based on the ARM architecture would force developers to create ARM versions of their apps or risk losing relevancy for failing to do so. Once they had developed ARM ports for the Mac, a large hurdle in porting to the iPad would be removed, and doing so would become a far easier pill to swallow.
This, with all the advantages of the ARM architecture on top of it, is why Apple should put out an ARM Mac: not for the future of the Mac, but for the future of the iPad.
These days, you seem to hear a lot about Apple’s recent lack of innovation. While this is a debatable topic, it’s not hard to see that many of the pieces of technology we have today owe at least some element of themselves to an innovation made by Apple. Everyday essentials like our phones, our watches, our music, movies, TV shows, and books, our classrooms, offices, and living rooms would not be the same without the great technological strides, nay leaps, made by Apple. But after over 40 years, what is Apple’s greatest contribution to the world? Some might say the obvious answer is the Mac. It revolutionized the graphical user interface and allowed computers to become ubiquitous, infiltrating every aspect of our daily lives. Others might say the iPhone, a device that made computers more portable than ever before by putting them in our pockets, disrupting the entire computer industry in the process. Others still might say the iPad, as its advent brought with it the first truly great tablet experience. But I don’t think it’s any of these products, or the innovations they introduced. In fact, I don’t think Apple’s greatest innovations come from additions to their products at all, but rather from omissions. What I mean is that Apple hasn’t benefited the industry most by making bold new additions to its products, but by taking things away and streamlining them. One great example is the omission of the disk drive on the Mac. If Apple had continued to ship built-in SuperDrives with the Mac, we would have kept using a superfluous technology for years rather than moving on to more advanced and better technologies, like internet downloads instead of discs. Another example is the removal of arrow keys from the original Mac’s keyboard.
The Mac was the first computer to ship with a mouse, which, at the time of its release, was a new and untested technology with an unsure future. The lack of arrow keys pushed users to the mouse, where they would otherwise have used the more comfortable alternative of arrow keys. Without this push, it is impossible to tell whether the mouse would have caught on, and if it hadn’t, the way we interact with technology today would be dramatically different. Apple’s killing of the headphone jack, premature as it might have been, likewise pushed users toward a better experience and made AirPods and other wireless earbuds a ubiquitous technology, the same thing the omission of the arrow keys did for the mouse. This innovation by omission can be seen in Apple’s trademark design as well. Apple’s design, in both hardware and software, is simple and attractive by omission; it takes only what it needs. This is the overall message of Apple’s innovation policy: less is more, simplification is innovation, and all of Apple’s greatest contributions stem from this.
Recently, it’s gotten easier to wrap your head around the idea of the iPad as a computer. Through Apple’s addition of features like split-screen multitasking, a file manager, and mouse support, the iPad is more like a computer than ever before. But it can be so much more. While these new additions are great, they take away some of the fundamental advantages that a tablet has over a traditional laptop. Tablets can be much more versatile, portable, and intimate computers than laptops can, yet these recent updates essentially turn the iPad into a touchscreen Mac. The prospect that allowed the iPad to garner so much hype before its launch was the possibility of a whole new, completely different form factor for computing, one that would allow for truly endless possibilities and applications. Instead of pursuing this future, Apple seems intent on building an easier-to-use Mac, which isn’t what people want. It should be pursuing a wholly different device, one that does things the Mac can’t by harnessing its unique hardware, software, and form factor.
Ever since the passing of Steve Jobs, the idea that Apple is not what it used to be has grown and grown. This sentiment has been amplified in recent years, as made clear by countless articles, YouTube videos, and podcast rants that all seem to say Apple is nothing compared to what it was under Jobs. While I personally believe that Apple has demonstrated a downward trend in the number of its innovations in recent years, I do not think that has everything to do with the absence of Steve Jobs. Jobs was a key player in realizing Apple’s innovations and making them ubiquitous, but his absence is not the prime reason those innovations have seemed few and far between lately. I think innovation as a whole has plateaued recently, as a result of a transition in the technology landscape. Most of Apple’s (and Silicon Valley’s as a whole) innovations of the past two or three decades revolved around making tech more personal and accessible. The sole purpose of the original Mac’s game-changing graphical user interface was to widen the accessibility of the personal computer, so that even a child could understand it. The iPhone put a computer in all of our pockets, virtually erasing the need for a traditional computer for most tasks and forcing it to adopt new features to remain viable, all while making computers more trendy and omnipresent than ever before. But now, computers are ubiquitous. They aren’t the scary, massive boxes they used to be. They’re stylish, popular, and everywhere. Everyone can use them, from four-year-olds to senior citizens; computers could hardly be easier to access and use. What once was a push for innovation is now an upper limit that stalls it. But this doesn’t mean Apple will never innovate again. New technologies, like AR, haven’t yet been able to infiltrate the consumer landscape, and it’s up to innovative companies like Apple to push them to do so.
Technology is always advancing, and we will always need companies like Apple to make its newest evolutions available to us.
I typically don’t support open-source technologies. They tend to be unpolished, inferior products that lack good user experiences and have unfocused visions and purposes. In the case of virtual assistants, however, I think the field could benefit greatly from an open-source offering. The problems with most virtual assistants stem either from privacy concerns, as with Amazon’s and Google’s, or from inferior databases, as with Apple’s Siri. An open-source virtual assistant could curb these setbacks by allowing supporters to contribute to the assistant’s database according to their own will. Both users and manufacturers could develop apps for the assistant and integrate their products directly with it, specializing the features the assistant offers when used with their products, while users could also create a more personalized, user-friendly assistant that caters to their specific needs in ways other assistants can’t. Eric von Hippel, an expert on user innovation, emphasizes the ability of user innovators (users creating and contributing to something for their own gain) to make products that function for them, whereas manufacturers and companies are forced to take a one-size-fits-all approach to maximize reach and market share. I think that, while a one-size-fits-all approach is a good one in many situations, in a situation where the product is supposed to be tailored to the user, a broader range of data contributions and an information database voluntarily provided by users could lead to unprecedented success in creating a more user-friendly and capable virtual assistant.
My intent in writing ACDW was to highlight some of the problems we are currently facing with the internet and to discuss the deeper questions surrounding them. I feel I have fulfilled that purpose, and would like to see how the questions posed are answered by time.
One key promise of the internet, upon its arrival, was that it didn’t belong to anyone, but to everyone. Now, however, many are concerned that the biggest players in the centralized internet (social media) own the web, and that if you wish to gain any audience, you must sing in their hall. So, is the future of the internet centralized? Will there be a decentralized revolution? If there is, what does that mean for the user experience?
With the advent of the internet and the virtual world, scarcity was one aspect of the physical dimension many had hoped to leave behind. But alas, scarcity has meandered its way into the digital realm, introducing another barrier into what could be an endless plane of possibility. It started out as digital rights management, meant to ensure fair compensation for content, but quickly exploded into cryptocurrency, essentially the most sugar-coated form of manufactured scarcity. So the question is: will scarcity put a cap on the limitless possibility of the internet, or will it tear it open further?