Amazon and Google Will Destroy Themselves. Here’s How:

Focus. Any Silicon Valley guru will tell you it’s the most important trait a tech company can have, and it’s true. In a world where tech can be applied to virtually anything, focusing on one sub-sector is often the key to success, all the way from startup to monopoly. For the last few years, however, some of the biggest tech firms in the world have seemingly begun to ignore this fact, most notably Google and Amazon. Over the past decade, these companies have strived to build ecosystems around their products and platforms, with the intent of locking users in and making it difficult for them to leave. Apple employs a similar practice, visible in products like the AirPods, which integrate seamlessly with the rest of Apple’s products, which in turn integrate seamlessly with each other.

However, where companies like Google and Amazon differ from Apple is in the services and products they use to reinforce their ecosystems. While all of Apple’s ecosystem-reinforcing products fit into that ecosystem clearly and sensibly, Amazon’s and Google’s do so to a far lesser extent. For example, one way Amazon builds up its ecosystem is by building products for both enterprise and consumer markets, but no clear lines can be drawn between these products; instead, they fit loosely into an overall ecosystem built out of many smaller ones. This all comes down to the fact that, in this day and age, monopolies are essentially legal, and companies like Amazon and Google take advantage of that, spreading into as many different markets as they can. They can afford to do this, too: even if they lose money in one category, they can subsidize those losses with the gains from their more successful products and categories, giving them a substantial advantage over smaller firms.

But just as this practice gives companies like Amazon and Google a key advantage over smaller firms, it also saddles them with a substantial weakness: they can’t focus on one subcategory the way smaller firms can. And as companies like Amazon and Google grow and spread into more and more markets, their focus becomes more and more diluted, until they can’t apply sufficient focus to any of the markets they occupy. That opening lets smaller, more focused firms beat them out and take over, slowly eating away at these monopolies until there is nothing left but a stumbling husk carrying the name of a company that was once synonymous with domination, a domination destroyed by the company’s own lack of focus.

Here’s How Digital Assistants Should Work.

Five years ago, Google and Amazon were touting the digital assistant as the future of consumer electronics. The promise of a smart assistant deeply integrated into your home was enticing: with countless possibilities and a simplified user interface, the digital assistant seemed poised to become a truly massive new platform, up there with those built by the smartphone and the personal computer. But five years on, it’s not difficult to see that Google and Amazon’s dream for the digital assistant hasn’t really taken off. While digital assistants have certainly become more popular, they haven’t become much more powerful, and as a result we’re largely using them for the same things we were five years ago: asking what the weather is and queuing Spotify playlists. But it doesn’t have to be this way. Digital assistants can be truly useful and powerful tools, tools that could greatly simplify our lives and the technology they are integrated with. Reaching that level of usefulness, however, would require more than the efforts of just Amazon, Google, and Apple. It would require a new digital assistant “operating system” of sorts, with different developers integrating their own digital assistants as applications within those “operating systems”. This approach would allow digital assistants to have much deeper and tighter integration with third-party applications and services, while still giving the companies behind the digital assistants control over their respective platforms. Digital assistants aren’t going to get any better through their makers’ efforts alone; they will only reach their true potential with the combined effort of thousands of developers working for hundreds of companies.
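To make this idea more concrete, here’s a minimal sketch, in Swift, of what such an assistant “operating system” could look like. Everything in it is hypothetical: names like AssistantPlatform and AssistantExtension are invented for illustration and don’t correspond to any real API. The idea is that the host platform owns the microphone, speech recognition, and intent parsing, while third-party “assistant apps” register the intents they can handle and supply the domain logic.

```swift
import Foundation

// Hypothetical sketch: third-party "assistant apps" register the intents they
// can handle with the host assistant platform, which routes each user request
// to whichever extension claims it. All names here are invented for illustration.

// A parsed user request: the intent name plus any extracted parameters.
struct AssistantRequest {
    let intent: String
    let parameters: [String: String]
}

// What an extension hands back for the assistant to speak or display.
struct AssistantResponse {
    let spokenText: String
}

// The contract a third-party developer would implement to plug into the
// assistant "operating system".
protocol AssistantExtension {
    /// Intents this extension can handle, e.g. "play_music" or "order_food".
    var supportedIntents: [String] { get }
    /// Called by the platform when one of the supported intents fires.
    func handle(_ request: AssistantRequest) -> AssistantResponse
}

// The host platform: keeps a registry of extensions and dispatches requests.
final class AssistantPlatform {
    private var registry: [String: AssistantExtension] = [:]

    func register(_ ext: AssistantExtension) {
        for intent in ext.supportedIntents {
            registry[intent] = ext
        }
    }

    func dispatch(_ request: AssistantRequest) -> AssistantResponse {
        guard let ext = registry[request.intent] else {
            return AssistantResponse(spokenText: "Sorry, I can't help with that yet.")
        }
        return ext.handle(request)
    }
}

// Example third-party extension: a music service integrating its playback.
struct MusicExtension: AssistantExtension {
    let supportedIntents = ["play_music"]

    func handle(_ request: AssistantRequest) -> AssistantResponse {
        let playlist = request.parameters["playlist"] ?? "your library"
        return AssistantResponse(spokenText: "Now playing \(playlist).")
    }
}

// Wiring it together: the platform owns parsing; the extension owns the logic.
let platform = AssistantPlatform()
platform.register(MusicExtension())
let reply = platform.dispatch(AssistantRequest(intent: "play_music",
                                               parameters: ["playlist": "Morning Jazz"]))
print(reply.spokenText)
```

The registry is the important design choice here: because extensions declare their intents up front, the platform keeps control over routing, privacy, and permissions, while developers get the kind of deep integration that today’s bolt-on voice skills can’t offer.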

Should Everyone Learn to Program?

As the computer has become more and more ubiquitous over the past few decades, programming has become more and more popular with it. This popularization is immediately visible, too: we’ve seen school after school invest in computer science programs, and dozens of apps that revolve around making programming easier and more understandable have flooded app stores across all platforms. Needless to say, programming is more popular than ever, but should it become so popular that everyone learns it? Should we put as much of an emphasis on learning a programming language as we do on learning a primary language, such as English? Well, not exactly. While programming is certainly extremely important to the advancement of technology, that doesn’t mean that everyone should learn it. Learning programming is more like learning an instrument, or even a trade, than another language. That’s because learning a programming language is typically done for the sake of making something with the language you’ve learned. When you learn a world language, you typically do so to communicate. When you learn a programming language, you do so to make something. The kind of work that goes into learning a programming language, while certainly similar to that of learning a world language, is much more analogous to learning an instrument. I guess what I’m trying to say is that programming requires a specific mindset, the way an instrument does, and that means we shouldn’t try to force it on everyone, but we should certainly offer it. On the other hand, the computer science classes that more and more schools are developing are certainly important, if not for teaching computer skills, then for teaching computer literacy, which grows more important as technology becomes more intertwined with our everyday lives.

AR Headsets or Smart Watches? Wearables and Which Ones Will Rule the Future.

Over the past few years, the world’s biggest tech firms have scrambled to become players in the wearable tech scene. First, Android phone manufacturers, including the likes of Samsung and Huawei, fought to beat Apple to market with their own smart watches, which launched and then gradually faded into obscurity as the release of the very thing they had raced to beat, the Apple Watch, drew closer and closer. Today, Apple is the undisputed king of the wearables sector, with the Apple Watch and AirPods growing in popularity every day as they eat up more and more of their respective markets. But now, as more firms jump on the wearables train, the loosely defined industry has slowly begun to shift its focus (no pun intended) to a new form of wearable technology, one that sits not on your wrist but on your face. For decades, AR headsets have been relegated to props in sci-fi movies, but recently, the futuristic and theoretical technology behind them has inched closer and closer to reality. As this technology becomes more feasible and begins to appear in commercial products, what will happen to the smart watch? My theory is that the smart watch will function much as it does today: as a companion rather than a standalone device. What will be different for the smart watch, as other wearable technologies come to market and rise to popularity, is that the smart watch of this fully wearable future will function more as part of a greater wearable system than as a companion to a more fully fledged device. AR glasses won’t be able to replace the smartphone alone, but in tandem with smart watches, wireless earbuds, and more, they will lead us into a wearable future.

Should Apple Make an iOS Laptop?

As time goes on, the iPad and the MacBook have grown closer and closer to converging. Arguably the greatest step toward this convergence came recently with the addition of trackpad and mouse support to iPadOS, which gave the iPad one of the key features of a fully-fledged MacBook. Following this path, Apple is slated to release a new MacBook with an ARM processor next year, which would bring the two platforms closer together than ever before by unifying their processor architectures. But as the iPad inches closer and closer to becoming a fully-fledged laptop, the question begins to arise: should Apple just make an iPad laptop that runs iPadOS? While this certainly seems like one logical evolution for the iPad in its current state, I don’t personally think the iPad needs to adopt the form factor of a laptop to be able to replace one. Having used the new Magic Keyboard with trackpad for a few days now, I can firmly say that the iPad feels more like a full laptop replacement than ever before, and I can also firmly say that I am happy with where the iPad currently stands as a laptop replacement. You see, the most “magical” part of the Magic Keyboard isn’t that it lets the iPad be a laptop; it’s that it lets the iPad be an iPad when you want it to be. The thing that makes the iPad an iPad is its versatility, and the new Magic Keyboard perfectly supports that versatility. If you want to use the keyboard and effectively have a fully functional laptop, it’s as simple as snapping the iPad into place on top of the keyboard, and if you want to use the iPad as an iPad, all you have to do is pull it away. This versatility is what makes the iPad so great, and to destroy it in the name of making the iPad lighter and more lappable would be a disgrace to Steve Jobs’s original vision for the product and the effective death of the iPad in the eyes of many.

Does the HomePod Have a Reason to Exist?

When Siri launched with the iPhone 4s in 2011, Apple promised it would take over the world. But its competitors quickly caught up. Within a few years, Google, Amazon, and even Microsoft all had their own digital assistants on the market, and, what’s more, they were better than Siri. Today, the smart home is a rapidly growing market, with thousands of devices from thousands of companies forming a tight-knit ecosystem with one device at the center: the smart speaker. As the smart home market began to emerge, Google and Amazon quickly got to work on their own smart speakers, soon introducing the Google Home and Amazon Echo respectively. Apple, however, took more time to produce its offering, and when Apple’s smart speaker, the HomePod, finally came to market, Google and Amazon were already well-established forces within the space. To add insult to injury, Siri had not improved much since its initial release, whereas Google Assistant and Amazon’s Alexa had continuously received update after update, making Siri pale in comparison. So the HomePod faced an uphill battle leading up to its release, and when that release finally came (after one delay), the HomePod was met with mixed reactions. Consumers balked at the hefty price tag, which put the HomePod several hundred dollars above its competitors, with notably less functionality. Even today, the HomePod lags behind Google’s and Amazon’s offerings in terms of sales, and while it wasn’t a flop, it can’t really be considered a massive success. So does the HomePod have a reason to exist? Well, yes, but that doesn’t mean you should buy it. It all has to do with the reason Siri is so “bad” in comparison to Google Assistant and Alexa, and, by extension, why the HomePod lacks so much functionality in comparison to the Google Home and Amazon Echo. The reason these other digital assistants and the speakers they run on are so powerful and fast is that Google and Amazon use user data and interactions to improve them, completely compromising their users’ security and privacy. Apple doesn’t do this, and that is why Siri lacks so much functionality in comparison. We need the HomePod not for what it is, but for what it represents. The HomePod is the last secure smart speaker on the market, and if we continue to use it, it will get better, until one day it is just as fast as Google’s and Amazon’s speakers, and ten times as secure.

The Deeper Ways Covid-19 Will Change the Way We Live.

The spread of Covid-19 has plunged the world into unmitigated chaos. As of May 1st, the deadly virus has claimed close to a quarter of a million lives, completely disrupting our way of life in the process. But Covid-19 isn’t only killing us; it’s slowly eating away at well-established institutions, both in America and in the world at large, and when this whole pandemic is over, the world we live in will be a radically different one. One area most affected by the spread of Covid-19 is the workplace. More than ever before, Americans are working from home, and when this is all over, many aren’t sure they’ll want to go back to the old way of life. Just on the surface level, such a massive shift in the way we work would have major implications for the way we live: more Americans working from home could mean drastically different societal roles, and the lessened presence of the office could completely overhaul real estate. On top of the differences in how we’re living, we’re also witnessing a colossal paradigm shift in where we’re living. Thousands are fleeing big cities to escape the firm grasp the virus holds over them, and experts are saying many won’t come back. Especially in the case of New York, this event has expedited the already growing trend of people leaving big cities for smaller suburban areas. The common factor between these two shifts in our culture, working from home and leaving big cities, is that neither was kickstarted by the spread of Covid-19; both were preexisting trends that the virus has merely expedited.

Why Apple is Failing at Services.

If you ask almost anyone in Silicon Valley, they’ll tell you that services are the future of the tech industry, and it’s not hard to see that they truly believe it. Subscription services are everywhere today; they’re behind the way we watch TV, listen to music, edit photos, and even do our work. Tons of companies have found financial success through this business model, as it practically guarantees recurring revenue, the thing investors want to see most. But despite this, Apple continues to see diminishing returns and subscriber numbers with their own services. Why is this? How is one of the biggest tech companies in the world, one that built an empire on making great products, failing where countless others have found so much success? Well, the answer is simple. To understand why Apple is failing at services, one must first understand why they have succeeded in all of their other ventures. From the very beginning, Apple has succeeded by doing things differently. They were willing to take risks in the name of innovation and furthering the user experience, risks that allowed them to find success with innovative new products like the iPod, iPhone, and iPad. But when it comes to services, Apple’s don’t stand out all that much when compared to, say, Netflix’s, Spotify’s, or Hulu’s. They don’t offer any improvement on these services, only iterations, iterations that are frankly worse than the services they iterate on. If Apple wants to succeed at services, they have to make Apple services, not HBO with an Apple skin on it, or Spotify with Siri support and a genuinely terrible user interface. They need to make the iPods and iPads of services: innovative new products that offer better user experiences than, and genuine improvements on, their competitors’ offerings. To put it simply, Apple needs to think different.

Is Microsoft the Next IBM?

In the 1980s, IBM was the undisputed king of the personal computer space. Although they had a late start, the IBM PC quickly gained market dominance over competitors like Apple and RadioShack, landing on the desks and in the homes of thousands of people in America and around the world. But today, IBM doesn’t make personal computers anymore, and they haven’t for nearly fifteen years. Today, IBM is known for their enterprise products, mainly selling mainframes and servers to businesses, with little to no consumer-facing business and less of a household name than ever before. In the 2020s, Microsoft is the undisputed king of the personal computer space, at least in terms of users. With a global market share of over 75%, they have near-complete control over the desktop and laptop computer industry, with their closest competitor, Apple, holding a meager 10% market share in comparison. But recently, Microsoft has put more and more focus on their enterprise products, including Azure, a cloud computing platform much like those offered by Google, Amazon, and, of course, IBM. This shift is eerily similar to IBM’s in the early 2000s, with both cases seeing a primarily consumer-facing company pivot completely into a business-facing one. IBM was forced to do this, as their operating system was vastly inferior to Microsoft’s Windows and lacked its user base, IBM having failed to hop on the graphical user interface train in time. This is similar to the difficulty Microsoft faces now, having failed in the mobile computing space with the Windows Phone, which came to market far too late to pose any sort of threat to the well-established iOS and Android platforms. If Microsoft continues to fall behind the curve, they will be forced to rely on their most profitable products, their enterprise products, and if they do that then, like IBM, they will fade from public view, eventually becoming little more than a footnote in the annals of technology history.

5 Years On, Was the Apple Watch a Success?

The Apple Watch was the first big new product line for Apple since the death of Steve Jobs, and, needless to say, it had a lot riding on it. Leading up to its announcement, speculation around the Apple Watch ran wild, with many heralding it as the next iPhone. But now, 5 years later, has the Apple Watch lived up to those lofty expectations? Well, yes, at least for Apple it has. When the Apple Watch was finally unveiled, it was met with a fair deal of disappointment. Some of this disappointment was inevitable, as the Apple Watch could never live up to the insane amounts of hype surrounding it, but some of it was definitely warranted. Ever since its release, the Apple Watch has felt more like an iPhone accessory than a standalone device, falling closer to Apple’s AirPods than to its iPhone in terms of functionality and impact. But this doesn’t mean the Apple Watch has been a letdown by any stretch of the imagination; it’s just not what we expected it to be. The Apple Watch is part of a broader technological future, wearables, where our computing is spread across smaller, simplified computers that we interact with in subtler ways. The Apple Watch was never meant to be a standalone device; it just wouldn’t work as one. Instead, it is meant as a supplement to preexisting devices, one that simplifies and advances the overall user experience of those devices, most notably in the way it negates the need to use them for the simpler tasks the Apple Watch can complete. Like I said, the Apple Watch is a device for the future, where our computing needs are divided and spread out across multiple platforms: AR headsets, wireless earbuds, and, currently, smart watches like the Apple Watch. While that future isn’t quite here yet, the Apple Watch is for now an excellent companion to the iPhone, thanks to its health and fitness capabilities, its messaging and calling applications, and the simpler tasks it handles that negate the need to look at your iPhone.