A lot of times, the company that tries to be the most revolutionary in its field ends up being the least successful. This struggle of great concepts failing as products is prevalent in the field of electronics, where new innovations that could change the way we use technology come around incredibly frequently. Yet even though these innovative new technologies could push a shift in our usage patterns, they often don't, typically following a cycle: a brief hype period after the showcase of a new technology, followed by a handful of products from hopeful manufacturers, a brief period of profitability, and then inevitable market failure, concluding in a halt in manufacturing and a dismissal of the initial idea due to a justified lack of confidence. One example of this cycle that you can witness at this very moment is the folding phone. The technology garnered tremendous hype and speculation, but when Samsung finally became the first company to bring it to consumers, its product flopped, and the products of every company that followed Samsung have as well. But why is this? Why do "revolutionary products" often fail even if at heart they are great ideas? The answer is simple, one word simple: execution. The success of any product, innovative or iterative, relies not on the initial idea but on the final execution. An idea is meaningless; what you can do with it is priceless. In the case of the folding phone, the idea is great, but every execution of it to date has been met with sheer failure. Companies haven't put in the time and effort to make a great product, instead pushing often faulty, more expensive devices with user experiences inferior to those already widely accessible. It really comes down to putting in the dedication to make something great, and if that can't be done, if the technology isn't ready yet, then don't do it. 
Innovation isn't just thinking of a great idea, it's turning that idea into an equally great product, and if you can do that, you can take over the world.
Whenever developing a new product in the field of consumer electronics, or any field for that matter, one inevitably encounters the following debate: should I make my product the best it can be, or the best it can be for cheap? Those last two words are the sole reason this conundrum exists, and they have been the central point of countless meetings, conference calls, and college lectures since the American industrial revolution. However, there is something fundamentally wrong with this debate, and in fact, the debate itself shouldn't even exist. This battle between the best possible user experience and the most accessible price tag stems from the idea of two different facets of accessibility: accessibility through affordability, and accessibility through ease of use. Seemingly, these two accessibilities are parallel and completely incompatible; at least, the countless companies that have faced this struggle would have you believe it to be that way. Typically, companies choose which of these accessibilities to focus on by determining which approach would better suit their product, usually coming to a crude conclusion by asking themselves a question along the lines of: do I want my product to find itself in the hands of a ton of somewhat content users, or a smaller number of happier ones? If the answer is the former, then the better choice would theoretically be to create a cheaper product, one more accessible to a wider range of users thanks to a price more of them would be able and willing to pay. On the other hand, if the answer is the latter, then theoretically the better route would be to create a more refined and, as a result, more costly product. However, the reality that product developers who follow this strategy fail to pick up on is that you don't need to sacrifice one of these accessibilities in favor of the other; you don't have to cut off your nose to spite your face. 
You don't need to spend exponentially more to get a better, more refined product; you just need the right people. While yes, one avenue to acquiring great employees is enticing them with flashy benefits and bigger paychecks, both of which require increased expenses, you can easily reach the same result with a great product. If you feel that you have a great product, and you can convince others that you do, then the people working on that product will pour their heart and soul into it. This is another misconception in the field of product development that appalls me: the notion that the most important people to sell your product to are the press. This is far from the truth; really, the most important group of people to convince are your employees. If you truly see greatness in your product, and you can convince the people working on it of its greatness too, you'll never need stock options, a kombucha fridge in your office, or a month of paid vacation days. If you can convince the right people of your product's greatness, they'll put everything into ensuring its success, and you won't have to spend an extra dime. Then, and only then, can you sell the world on your product, and make it truly accessible, in all definitions of the word.
No. In fact, the computer will replace the smartphone. This is thanks to the superior productivity and creativity applications available on the computer, which require the computer's specific form factor (whether it be a tablet, laptop, or desktop) to be used effectively. The phone isn't the evolution of the personal computer; it's an extension of it. It's meant to perform the tasks it was initially designed to perform, namely light web browsing, messaging, and social applications. The introduction of AR will one day negate the need for a phone, but it will be impossible for it to do so without the PC, likely using the PC to harness processing power not feasibly available in a chassis small enough to be a pair of glasses. So no, the phone will not replace the PC; instead, the reverse will happen.
Whenever a new Apple product comes out, you're sure to hear everything about it for days. If it's a phone, you hear about the fancy new camera; for AirPods, the noise cancelling; for the MacBook, the new keyboards; but for all of these products, the thing you seem to hear about the most is always the price. Apple is a premium company, and their name isn't exactly synonymous with cheap. But many think that Apple's products, especially in recent years, aren't just expensive, but overpriced. It's not hard to see why some people have this sentiment; two hundred and fifty dollars for a pair of headphones or one thousand for an iPhone isn't what you would call cheap, but calling these products overpriced is another story. The detail that many Apple naysayers omit when deeming the company's products overpriced is that when you buy an iPhone, MacBook, or AirPods, you're not just paying for the phone, laptop, or headphones; you're paying for the whole user experience. Apple products carry the guarantee of a user experience that is a cut above the rest, and one that carries with it a higher price tag. This elevated user experience is pricey for a company to develop, with countless hours, dollars, and assets poured into design, research, and development. This extra effort has allowed for user experience breakthroughs such as the seamless integration between AirPods and iOS devices, or, even farther back, the original Mac and its graphical user interface. So when you read that it costs Apple two hundred and fifty dollars to assemble a phone they charge one thousand for, you're not seeing the whole picture. It's not just the assembly; it's the seamless nature of the software the device runs, the research behind its innovative new features, and the breathtaking design that embodies it. 
All of these factors, and the processes, time, and people behind them, make Apple's devices more costly to develop than their competitors', but they're also what makes them better, and they truly make an Apple product an Apple product.
Jony Ive made the news last summer when it was announced that the longtime Apple chief designer, responsible for giving a personality to many of the company's most important products, would be leaving the company. The world was shocked; Ive had been a monumentally important force at Apple, with many considering his role close to equal in significance to that of Tim Cook, the company's CEO. While he's only been gone for just over two months, Ive's absence has quickly become noticeable, especially in Apple's design. Newer products like the new Mac Pro, MacBook Pro, and even the new iPhone serve as shining examples of a clear departure from Jony Ive's design policy of form over function, a policy that carried Apple from its earlier failures to where it is today. These devices are thicker, larger, and more functional than before, all while being less simplistic. A great example is the iPhone, which received a plethora of online ostracism for its controversial camera design, which, while allowing for greater photographic potential, stuck out like a sore thumb compared to earlier designs. So, is this the future of Apple's design: function over form? Tim Cook's history as a successful industrial engineer, a position that exists solely to cut costs and add functionality with little consideration for the beauty and appearance of a product, would certainly add evidence to this theory. However, design plays a massive role at Apple, often setting it apart from its competitors, so how will a decreased emphasis on it affect the company? To answer that, we will have to wait and see.
In the tech world, a successful transition is a scarcity. Flops like Windows Vista, Windows 8, and, well, pretty much any version of Windows after '95 just serve to prove this point. This curse, however, seemingly does not afflict Apple. Throughout its history, the company has pulled off numerous successful transitions, such as the transition from Mac OS 9 to Mac OS X, or, even further back, the transition from the Apple I to the Apple II, and, most relevant to this topic, the monumental architecture transition from PowerPC to Intel. The Intel transition should go down in history as a master class in product transitions; in just one year, Apple successfully, and, more importantly, gracefully, moved the entire Mac lineup from the PowerPC processor architecture to Intel's, all while seeing widespread adoption after doing so. This monumental transition does not get nearly as much attention as it deserves, and even compared to everything the company had done before and has done since, it still goes down as one of the most impressive feats in Apple's history. But now another, equally massive transition could be coming, and if the rumors are correct, the Mac could be moving to ARM. The ARM architecture, which powers Apple's iPhones, iPads, and Watches, has several key advantages over Intel's, which is currently in use in all of Apple's Macs. For one, ARM chips are significantly more efficient, with their lower power consumption leading to what can equate to double the battery life of Intel computers. Furthermore, ARM processors support native cellular radios, which, with the advent of 5G, could prove tremendously useful. Finally, and specifically for Apple, the company already produces ARM chips for use in its phones, watches, tablets, and more. This could make Macs running ARM chips significantly cheaper than preexisting Intel versions, as in-house processor manufacturing would severely undercut the cost of production. 
While all of these prospects could serve as reasons why Apple could make an ARM-based Mac, they're not why Apple should make one. In fact, the reason why Apple should make an ARM-based Mac doesn't even pertain to the Mac at all; it pertains to the iPad. Apple sees the iPad as the future of the personal computer, but it's pretty clear that not everyone else does. While the company has made strides to bring the iPad closer to its vision, such as a dedicated, first-party hardware keyboard, or, more recently, iPadOS, with its range of previously Mac/PC-exclusive abilities, it is still clear that many people don't see the iPad as a computer. A major contributing factor to this sentiment stems from the lack of apps, many of which are deemed essential to a computing experience. This includes apps like Lightroom, Photoshop, and even Apple's own Final Cut, all apps that people base entire careers on, and that have no equivalent offering on the iPad. The reason for this is a lack of developer support, as developers don't see the iPad as a viable platform for their applications given the time and cost needed to develop apps for it. A large majority of this cost comes from having to port or translate Mac or PC apps that were designed to run on Intel processors into iOS apps designed to run on ARM processors. Porting an app from Mac or PC to iPad or iPhone is no easy task, and developers simply don't see it as a worthwhile one, given that the large majority of computer users reside on the Mac because that's where the apps are; it's a catch-22. However, an ARM Mac could break this loop. Releasing a Mac based on the ARM architecture would force developers to develop versions of their apps for it, or risk losing relevancy for failing to do so. Once they had developed ARM ports for the Mac, a large hurdle in porting to the iPad would be removed, and doing so would become a far easier pill to swallow. 
This, with all the advantages of the ARM architecture on top of it, is why Apple should put out an ARM Mac: not for the future of the Mac, but for the future of the iPad.
These days, you seem to hear a lot about Apple's recent lack of innovation. While this is a debatable topic, it's not hard to see that many of the pieces of technology we have today owe at least some element of themselves to an innovation made by Apple. Everyday essentials like our phones, our watches, our music, movies, TV shows, and books, our classrooms, offices, and living rooms would not be the same without the great technological strides, nay leaps, made by Apple. But after over 40 years, what is Apple's greatest contribution to the world? Some might say the obvious answer is the Mac. It revolutionized the graphical user interface and allowed computers to become ubiquitous, infiltrating every aspect of our daily lives. Others might say Apple's greatest innovation is the iPhone, a device that made computers more portable than ever before by putting them in our pockets, disrupting the entire computer industry in the process. Others still might say the iPad, as its advent brought with it the first truly great tablet experience. But I don't think it's any of these products, or the innovations they introduced. In fact, I don't think Apple's greatest innovations come from additions to their products at all, but rather from omissions. What I mean by this is that Apple hasn't benefited the industry most by making bold new additions to its products, but by taking things away and streamlining them. One great example comes in the form of the omission of the disk drive on the Mac. If Apple had continued to ship built-in SuperDrives with the Mac, we would have continued to use a superfluous technology for years rather than moving on to more advanced and better technologies, like internet downloads instead of discs. Another is the removal of arrow keys on the original Mac. 
The Mac was the first computer to ship with a mouse, which, at the time of its release, was a new and untested technology with an unsure future. The lack of arrow keys pushed users to use the mouse, where they would have otherwise used the more comfortable alternative of the arrow keys. Without this push, it is impossible to tell whether or not the mouse would have caught on, and if it hadn't, the way we interact with technology today would be exponentially different. Apple's killing of the headphone jack, premature as it might have been, pushed users to a better user experience, and made AirPods and other wireless earbuds a ubiquitous technology, the same thing the omission of the arrow keys did for the mouse. This innovation by omission can be seen in Apple's trademark design as well. Apple's simple design language is what makes it attractive. Apple's design, in both hardware and software form, is simple and attractive by omission; it takes only what it needs. This is a good overall message for Apple's innovation policy: less is more, simplification is innovation, and all of Apple's greatest contributions stem from this.
Recently, it's gotten easier to wrap your head around the idea of the iPad as a computer. Through Apple's addition of features like split-screen multitasking, a file manager, and mouse support, the iPad is more like a computer than ever before. But it can be so much more. While these new additions are great, they take away some of the fundamental advantages that a tablet has over a traditional laptop computer. Tablets can be much more versatile, portable, and intimate computers than laptops can, yet these recent updates essentially turn the iPad into a touchscreen Mac. The prospect that allowed the iPad to garner so much hype before its launch was the possibility of a whole new, completely different form factor for computing, one that would allow for truly endless possibilities and applications. Instead of pursuing this future for the iPad, Apple seems intent on pursuing an easier-to-use Mac, which isn't what people want, when it should be pursuing a wholly different device, one that does things the Mac can't by harnessing its unique hardware, software, and form factor.
Ever since the passing of Steve Jobs, the idea that Apple is not what it used to be has grown and grown. This sentiment has been amplified in recent years, as made clear by countless articles, YouTube videos, and podcast rants that all seem to say that Apple is nothing compared to what it was under Jobs. While I personally believe that Apple has demonstrated a downward trend in the number of its innovations in recent years, I do not think that has everything to do with the lack of Steve Jobs. I believe that Jobs was a key player in realizing Apple's innovations and making them ubiquitous, but I don't think his absence is the prime reason those innovations have seemed few and far between lately. I think that innovation as a whole has plateaued recently, as a result of a transition in the technology landscape. Most of Apple's (and Silicon Valley's as a whole) innovations of the past two or three decades revolved around making tech more personal and accessible. The sole purpose of the original Mac's game-changing graphical user interface was to widen the accessibility of the personal computer, so that even a child could understand it. The iPhone put a computer in all of our pockets, virtually erasing the need for a computer for most tasks and forcing the computer to adopt new features to remain a viable product, all while making computers more trendy and omnipresent than ever before. But now, computers are ubiquitous. They aren't scary, massive boxes like they used to be. They're stylish, popular, and everywhere. Everyone can use them, from four-year-olds to senior citizens; computers can't get much easier to access and use. What once was a push for innovation is now an upper limit that stalls it. But this doesn't mean that Apple will never innovate again. New technologies, like AR, haven't yet been able to infiltrate the consumer landscape, and it's up to innovative companies like Apple to push them to do so. 
Technology is always advancing, and we will always need companies like Apple to make its newest evolutions available to us.