Why Apple is Failing at Services.

If you ask almost anyone in Silicon Valley, they’ll tell you that services are the future of the tech industry, and it’s not hard to see that they truly believe it. Subscription services are everywhere today; they’re behind the way we watch TV, listen to music, edit photos, and even do our work. Plenty of companies have found financial success through this business model, as it practically guarantees recurring revenue, the thing investors want to see most. Yet despite this, Apple continues to see diminishing returns and dwindling subscriber numbers with their own services. But why is this? How is one of the biggest tech companies in the world, a company that has built an empire on making great products, failing where countless others have found so much success? Well, the answer is simple.

To understand why Apple is failing at services, one must first understand why they have succeeded in all of their other ventures. From the very beginning, Apple has succeeded by doing things differently. They were willing to take risks in the name of innovation and furthering the user experience, risks that allowed them to find success with innovative new products like the iPod, iPhone, and iPad. But when it comes to services, Apple’s offerings don’t stand out all that much compared to, say, Netflix’s, Spotify’s, or Hulu’s. They don’t offer any improvement on these services, only iterations, and iterations that are frankly worse than the services they iterate on. If Apple wants to succeed at services, they have to make Apple services: not HBO with an Apple skin on it, or Spotify with Siri support and a genuinely terrible user interface. They need to make the iPods and iPads of services, innovative new products that offer better user experiences than, and genuine improvements on, their competitors’ offerings. To put it simply, Apple needs to think different.

Is Microsoft the Next IBM?

In the 1980s, IBM was the undisputed king of the personal computer space. Although they had a late start, the IBM PC quickly gained market dominance over competitors like Apple and RadioShack, landing on the desks and in the homes of thousands of people in America and around the world. But today, IBM doesn’t make personal computers anymore, and they haven’t for nearly fifteen years. Today, IBM is known for their enterprise products, mainly selling mainframes and servers to businesses, with little to no consumer-facing business, and they are less of a household name than ever before.

In the 2020s, Microsoft is the undisputed king of the personal computer space, at least in terms of users. With a global market share of over 75%, they have near-complete control over the desktop and laptop computer industry, with their closest competitor, Apple, holding a meager 10% share in comparison. But recently, Microsoft has put more and more focus on their enterprise products, including Azure, a cloud computing platform much like those offered by Google, Amazon, and, of course, IBM. This shift is eerily similar to IBM’s in the early 2000s, with both cases seeing a primarily consumer-facing company pivot to become a business-facing one. IBM was forced to do this because their operating system was vastly inferior to Microsoft’s Windows and lacked its user base, having failed to hop on the graphical user interface train in time. This is similar to the difficulty Microsoft faces now, having failed in the mobile computing space with the Windows Phone, which came to market far too late to pose any sort of threat to the well-established iOS and Android platforms. If Microsoft continues to fall behind the curve, they will be forced to rely on their most profitable products, their enterprise products, and if they do that then, like IBM, they will fade from public view and eventually become little more than a footnote in the annals of technology history.

5 Years On, Was the Apple Watch a Success?

The Apple Watch was the first big new product line for Apple since the death of Steve Jobs, and, needless to say, it had a lot riding on it. Leading up to its announcement, speculation around the Apple Watch ran wild, with many heralding it as the next iPhone. But now, five years later, has the Apple Watch lived up to those lofty expectations? Well, yes, at least for Apple it has.

When the Apple Watch was finally unveiled, it was met with a fair deal of disappointment. Some of that disappointment was inevitable, as the Apple Watch could never have lived up to the insane amount of hype surrounding it, but some of it was definitely warranted. Ever since its release, the Apple Watch has felt more like an iPhone accessory than a standalone device, falling closer to Apple’s AirPods than to the iPhone in terms of functionality and impact. But this doesn’t mean the Apple Watch has been a letdown by any stretch of the imagination; it’s just not what we expected it to be. The Apple Watch is part of a broader technological future, wearables, in which our computing is divided among smaller, simpler devices that we interact with in more subtle ways. The Apple Watch was never meant to be a standalone device, and it simply wouldn’t work as one. Instead, it is meant as a supplement to preexisting devices, one that simplifies and improves the overall user experience of those devices, most notably by negating the need to pull them out for the simpler tasks the Apple Watch can handle on its own. Like I said, the Apple Watch is a device for a future in which our computing needs are divided and spread out across multiple platforms: AR headsets, wireless earbuds, and, currently, smartwatches like the Apple Watch. While that future isn’t quite here yet, the Apple Watch already offers an excellent companion to the iPhone through its health and fitness capabilities, its messaging and calling applications, and the simpler tasks that negate the need to look at your iPhone.

We Need Theranos Now More Than Ever Before

Perhaps the most famous story of failure in Silicon Valley is that of Theranos. A unicorn company poised to take over the Valley, with billions of dollars’ worth of runway and massive amounts of hype behind it, it seemed that the health tech firm’s success was inevitable. But today, the name Elizabeth Holmes, and by extension Theranos, is not synonymous with success; instead, it is synonymous with fraud and corruption. But it’s times like these when the firm’s failure hurts the most. Theranos hoped to bring widespread instant blood testing to the market, which, in a global pandemic like this one, would have made testing for viruses and diseases significantly less of a struggle.

The Greatest Overlooked Potential for AI

Out of all the up-and-coming technologies, AI has, almost indisputably, the greatest amount of hype surrounding it. This hype largely stems from the fact that AI’s true potential is relatively unknown, making its theoretical potential seem limitless. While this certainly makes it difficult to accurately gauge the true potential of artificial intelligence, some of its theoretical applications are incredibly intriguing. People have found ways to cram artificial intelligence into practically every use case, which might make you think that almost all the practical use cases have been covered, but in reality, this isn’t the case. Perhaps the greatest use case for AI has not been discovered, or at least not realized, as of yet, and in my opinion, this overlooked use case is more accessible programming.

Over the past few decades, as computing has become more and more ubiquitous, programming has failed to become noticeably easier to adopt along with it. While coding has certainly become more user-friendly in certain respects, it largely remains too difficult to be accessible to the masses. But AI could change this. AI, and by extension machine learning, could help make programming drastically easier and more accessible to the general population. By using pattern recognition to interpret simpler, more natural commands and scripts, machine learning could enable far more versatile syntax for programming languages, meaning coding would be significantly easier to adopt. But why does this untapped application have so much potential? It’s simple: more and more people would be able to realize their ideas if they had the ability to program them into reality, and when we can all realize our dreams, the world becomes a better place.
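To make that idea a little more concrete, here is a minimal, purely hypothetical sketch in Python of the kind of translation such a system might perform. Everything in it is invented for illustration: it uses a handful of hand-written patterns as a stand-in for what a trained model would do far more flexibly.

```python
import re

# A toy, rule-based stand-in for a machine-learning model: recognize the intent
# behind a plain-English command and turn it into a line of real code.
# All patterns, names, and example commands here are invented for illustration.
PATTERNS = [
    # "sort my photos by date"  ->  photos = sorted(photos, key=...)
    (r"sort (?:my )?(\w+) by (\w+)",
     lambda m: f"{m.group(1)} = sorted({m.group(1)}, key=lambda item: item['{m.group(2)}'])"),
    # "show the first 5 emails"  ->  print(emails[:5])
    (r"show (?:the )?first (\d+) (\w+)",
     lambda m: f"print({m.group(2)}[:{m.group(1)}])"),
]

def command_to_code(command: str) -> str:
    """Translate a plain-English command into a line of Python, if a known pattern matches."""
    for pattern, build in PATTERNS:
        match = re.fullmatch(pattern, command.strip().lower())
        if match:
            return build(match)
    return "# Sorry, I didn't understand that command."

print(command_to_code("Sort my photos by date"))   # photos = sorted(photos, key=lambda item: item['date'])
print(command_to_code("Show the first 5 emails"))  # print(emails[:5])
```

The hard-coded patterns are exactly the part a machine-learning system would replace: instead of a few rigid templates, a trained model could recognize the intent behind a huge variety of phrasings and fill in the unforgiving syntax on the user’s behalf.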

What Does the Future Hold for the Mac?

Over the past ten years, Apple has slowly but surely built up a case for the iPad as a replacement for the traditional computer, and by extension, the Mac. But even today, Apple continues to release new and improved Mac models, with even more rumored to be on the horizon. So, in a future where almost everyone uses an iPad for their daily computing needs, where does the Mac stand? The best answer seems to be as the device that does the things the iPad can’t, a list which, thanks to rapid innovation, is constantly growing smaller. What remains on that list is increasingly made up of edge cases, but there are a few that may never truly work on the iPad. One such case that comes to mind is enterprise use. Because the iPad is designed around its hallmark portability, enterprise and server use is one application it will most likely never be able to handle to the same extent as the Mac. Another is app and web development, which typically requires machines more powerful than the ones the finished applications will run on in order to maximize those applications’ performance. But besides these two use cases, which have relatively small user bases, the iPad should soon be able to do most of the important things the Mac can, negating the need for it almost entirely.

The Mac Was the iPad of the 80s.

Today, the Mac and the graphical user interface it introduced are ubiquitous, seen in coffee shops, classrooms, and airports all around the world, but it wasn’t always that way. Back in 1984, when the original Macintosh was unveiled, it was met with about as much skepticism as excitement, with many unsure whether the computer’s headlining new graphical operating system would take off, and at first, it didn’t. It took years for the Mac to catch on, but eventually, customers and competitors saw the genius in Apple’s design, and slowly but surely, the graphical user interface took over the computing world. Today, another of Apple’s product lines is following a story similar to the Mac’s: the iPad. When Steve Jobs announced the iPad a decade ago, it was met with a familiar mix of hype and skepticism. Ten years on, the iPad and its touch-based navigation are slowly taking the world by storm, and before we know it, the iPad will be as prominent as the Mac.

Why the iPad mini Sucks

Over the past few years, Apple has made strides to turn the public’s perception of the iPad from an entertainment device into a computer replacement, but one product stands in their way: the iPad mini. The iPad mini epitomizes everything that is wrong with tablets, with the key reason for its existence being entertainment. The mini’s smaller screen restricts the kind of work that can be done on it to far too extreme an extent, making it far too impractical to stand in for a fully fledged laptop the way its larger brothers can and relegating it to a glorified, larger phone, and it’s this restriction that is so dangerous to the iPad’s adoption and evolution. The mini’s existence wouldn’t bother me at all if it weren’t such a threat to the rest of the iPad lineup. As I said before, the advancements and changes Apple has made to the iPad lineup are largely negated by the mini’s existence, which helps to retain the general public’s view that the iPad works best as an entertainment device, not a next-generation computer.

What Will True Consumer Ready Augmented Reality Really Look Like?

For years, the subject of augmented reality has been relegated to the stuff of science fiction, but finally, after years of seeing it in books, movies, and TV shows, true consumer augmented reality may soon come to fruition in our own reality. Over the past few years, more and more big tech firms such as Google, Facebook, and Apple have been hopping on the AR train, but it seems each of these firms has a different idea of AR’s applications and what it can really be. Social media companies like Facebook and Snapchat are developing AR for entertainment and social uses, in line with the services they provide. More business-facing firms such as Google and Microsoft are pushing their AR products for enterprise use, whereas Apple, long rumored to be developing an AR headset, is seemingly building their AR platform primarily for consumer use, with some speculating that said platform could evolve into a product with as big an impact on the tech landscape as the original iPhone. But which one of these varied visions will AR fulfill? The answer is all of them, and none of them, at the same time. If AR does have as big an effect on the tech landscape as the smartphone did, and judging by the plethora of players in the space, the chances are high, then AR will not be defined by these existing applications; instead, it will define new ones. When the iPhone came out, it didn’t disrupt the smartphone space; it disrupted the personal computer space, and it redefined many computer applications, such as communication and entertainment. AR will do the same thing: rather than being restricted by preexisting applications, AR will create new ones, ones informed by the wide range of hardware and software potential made available by the platform.

Has Computer Innovation Plateaued?

For the past 40 years, computer innovation has been driven solely by a demand for greater accessibility. First, computers became smaller, so that they could fit on your desk. Then, their operating systems became easier to use, moving away from text-based interfaces and adopting far more user-friendly graphical ones. After that, computers became even smaller, so we could viably take them anywhere in a backpack or pocketbook. Next, they became connected to one another with the advent of the internet, revolutionizing global communications and making data more accessible than ever before. Most recently, they became small enough to fit into our pockets, and simple enough to be controlled solely by our hands, without the need for any peripherals in between. Through all of these advancements, computers have become more and more accessible, both in terms of ease of use and availability. But now, many are quick to claim that this rapid innovation in the computer space has stagnated, and that the well of computer innovation has run dry.

However, this is not the case. While the rapid innovation in the computer space is definitely not as visible as it was a decade ago, it certainly hasn’t stopped. What has changed is the goal that computer innovation is pursuing. The last few decades’ goal of widespread computer accessibility has largely been met, with more people using computers than ever before thanks to the aforementioned innovations. The well of computer innovation has not run dry; the well of computer accessibility innovation has. Now, as I said previously, computer innovations are being made in pursuit of a new goal: integration. A majority of the computer innovations made over the past decade have been made to help integrate computers into more fields. AI advancements are pushing virtual assistants into our homes through smart home devices. Machine learning is being used to put more powerful computers in our cars, with the ultimate goal of self-driving capabilities. These advancements are built upon those made by innovators who worked to make the computer better, and now computers are being used to make every aspect of our lives better. So, to answer the question “Has computer innovation plateaued?”: no, it has not. It has simply become part of a larger system: human innovation.