For the past few years, with every iteration of the iPad, Apple has taken more steps toward bringing the device closer to the modern perception of a computer. Last year's iPadOS completely changed the way the iPad is used, finally adding the much-requested external storage support, desktop-class browsing, and multitasking rivaled only by full-fat PCs. However, Apple's most recent step toward developing the iPad as a laptop replacement has garnered as much backlash as it has support. Apple's recent choice to add native mouse and trackpad support to the iPad is a bold move and a massive step toward making the iPad a viable laptop replacement for many, but its implementation has come with criticism over what could be viewed as a betrayal of Steve Jobs' original vision for the product. When Jobs introduced the iPad as a "magical piece of glass," the device became synonymous with touch input, and that touch input became synonymous with the versatility that would hopefully allow the iPad to lead the post-PC world. What made the iPad so special ten years ago, and what still makes it special today, is the sheer magic and versatility that come with it. You can use it anywhere you want, thanks to its casual input methods, small and light profile, and comfortable design; you can use it anytime you want, thanks to its all-day battery life and always-connected abilities courtesy of cellular options; and you can use it any way you want, thanks to a versatile, touch-centric OS that allows for capabilities and applications that are simply not possible on conventional computers. This last point is why mouse and trackpad support on the iPad is not a betrayal of Steve Jobs' vision for it, but an advancement and reinforcement of it. The backlash that comes with this announcement stems not from a mistake on Apple's part, but from a misunderstanding on the critics' part. Those who attack trackpad and mouse support in the name of protecting Jobs' vision actually have the opposite effect.
Touch input is not what makes the iPad special; versatility is. On top of this, the introduction of trackpad support isn't changing the fundamental idea behind the iPad, or what Steve Jobs envisioned it to be. First and foremost, the iPad is still a touch device; mouse and keyboard support just provides more functionality and versatility. The versatility of the iPad is what makes it so special: it doesn't define the way you use it; instead, it is defined by the way you use it. Another input method in mouse and trackpad support allows for all-new workflows that weren't possible or productive with touch input before, simultaneously reinforcing Jobs' vision for the iPad and realizing that vision more fully than ever.
The iPad mini is still the stupidest piece of crap on the planet, though, but that's a rant for another day.
When Alan Turing refined what would become the first iteration of the modern-day computer back in the late 1940s, he had no way of knowing the impact it would have on the world future generations would inherit from him. With the help of many great thinkers, the computer has truly infiltrated every corner of our lives since Turing's initial model, defining many aspects of everyday life. One aspect of life impacted in a particularly large way by the computer's rise to prominence is the way we work and the way our work interacts with the rest of our lives. Through multiple separate avenues, the computer has worked to define the way that we work and simultaneously blur the line between our professional and personal lives. One way the computer has done this, more observable than ever right now, is its allowance of remote working. The ability to work from home is entirely reliant on the computer, and what has now become the way that 43 percent of Americans work (according to CNBC) would be impossible without the presence and prevalence of the computer. Another way that technology has changed our work culture is through email. The ability, be it a benefit or a curse, to maintain constant contact with colleagues has had a truly monumental impact on the way we work. Furthermore, this ability has further blurred the line between what's work and what's not by bringing work-related concerns and conversations into our homes, which had been, for the most part, a safe haven of personal life guarded from our professional ones. Through these means and many more, technology has had a truly massive impact on the way we work and the way our work relates to the rest of our lives.
With the increased viability of remote working that comes with greater access to computers, working from home has never been easier or more prevalent. Despite this, a majority of America's workforce still commutes to an office or other business facility, even though this is becoming increasingly unnecessary and, in many cases, less cost-effective for both employers and employees than working from home. While there are certainly some industries and professions entirely incompatible with the practice of working from home, they are a minority, and a large percentage of jobs could be completed remotely. Recently, we have seen an upward trend in Americans working from home, but we still have a long way to go before the economic and productivity benefits of this system can be fully realized. However, a recent development has prompted a shakeup in American work culture: the outbreak of Covid-19. In the name of precautions to deter the spread of the virus, more and more workplaces are temporarily shutting down their physical offices and workspaces and pushing their employees to work from home. With so many Americans working from home, we will get a rare glimpse of the effects of a large population of remote workers on the economy, society, and culture of America.
In the past, luxury has almost always been synonymous with high prices. Whether backed by perceived, and often superficial, craftsmanship, design, or quality, the greatest obstacle, and selling point, for luxury products has always been what they sell for. But today, luxury has begun to take on a radically different definition and identity than its counterparts in previous generations. Along with shedding the association with classy, bespoke design and sophisticated mannerisms for more casual, often futuristic inspirations, luxury has swapped the price-point obstacle for a new one, more relevant to today's day and age: scarcity. In a world where virtually all information is at our fingertips and just as much personal expression occurs in the digital world as in the physical one, the hindrances and benefits of limited resources and availability are less present than ever. To balance this equation, luxury has taken on a new form, and it is paid for with a new currency: time. While some would argue that time has always been as valuable as any currency, this idea is more convincing than ever today, when life is lived in milliseconds thanks to the advent of the internet and the ever-connected, interconnected world it brought with it. Now time, or more precisely dedication, is just as valuable a currency as silver or gold, and an even more valuable one when dealing with modern-day luxury. Brands such as Supreme, Kanye West's Yeezy, and others have achieved clout and household-name status among the ranks of Gucci and Louis Vuitton by offering new means of paying for their products: in dedication. Owning and representing one of these companies' products demonstrates that you did more than just spend your money on them; you dedicated time toward acquiring them, whether that meant waiting in line outside a physical store or trying, like thousands of others, to purchase them online, often to little or no avail.
Money is a dying symbol, and time is ever valuable: it is just as limited as gold or silver, yet there is no concrete value attached to it, making it technically invaluable, and the luxuries bought with it just as much a symbol of wealth, though what kind of wealth is a different matter, as luxury bought with dollars.
When Steve Jobs introduced iTunes, he brought up how he believed people wanted to feel a connection to their music, in that case by owning it, hence why the music streaming services of the time didn't see much success. Today, it's easy to see why some would consider Jobs' view wrong, given the overwhelming grasp that subscription-based services such as Spotify or Apple's own Apple Music hold on the music consumption market. However, Jobs' argument can still be seen as accurate today, as the way we establish connections with the things we consume has evolved with the ways we consume them. We live in an era of instant and constant feedback. If you want something, it's only a click away. The subscription service goes hand in hand with this way of life, both for users and for producers. For users, the subscription service follows this pattern by offering constant and instant gratification through vast libraries of content and the instantaneous access allowed by streaming. For producers, a subscription allows for constant, steady, and continuous profit via recurring revenue, while simultaneously providing instant profit from the thousands of transactions being completed between producers and users. These mutual benefits are what made the subscription service popular among users and viable among producers and manufacturers. Some argue this approach does bring with it a downside, though: the damage to the connection between users and products due to the lost feeling of possession. However, I contest this argument, as I don't think people really form connections with their tech possessions the same way they do with their non-tech possessions, due to the limited life span of the former. I don't think people form that kind of connection with tech products, including streamed shows, music, and more, as it's hard to form a connection with something that has a fluid, ever-changing look, feel, and method of interaction.
I think that people form connections with products through the user experiences they encounter on them, such as meeting a new friend on Snapchat, finding a great new album on Spotify, or building a cool world in Minecraft. These experiences, not ownership of the products they are found within, are what establish connections between user and product, and instead of making them scarcer, these products have built connections by making the experiences that lead to them more accessible.
While many think that Apple's success comes from defining new industries with revolutionary ideas, what really accounts for Apple's wild success today is a pattern of redefining preexisting industries with great products. For example, Apple didn't create the smartphone with the iPhone; that market and product already existed. Apple simply made it far better and defined the product as what it still is today. The same is true of other standout products like the iPod and the original Mac: the MP3 player and the personal computer were both around before these products were introduced, but both product fields were drastically different afterwards. Recently, Apple had the opportunity to redefine another industry, one with much less clear ties to tech, but with astronomically important roles in it. The Apple Card could've done what the iPod, iPhone, and Mac did before it: redefine an industry, in this case banking. So far, it hasn't. Why not? One important detail to recognize about the success stories of the iPod, iPhone, Mac, and iPad redefining their respective industries is that, prior to their release, each industry lacked several key traits, which made disruption much more attainable; most notable among them was a good user experience. With each of these products, Apple brought a good user experience and other missing pieces to their respective product fields, bringing tremendous success and market disruption with them. Like the other pre-Apple-disruption fields, banking severely lacks a good user experience. Especially in recent years, banks have been hurting their users through actions like decreasing interest on bank accounts and increasing it on loans, on top of all the hidden fees, fine print, and nauseating contracts that have become synonymous with banking. If any company could revolutionize banking and make it easier and more accessible than ever, Apple could. But why didn't they with the Apple Card?
The answer is simple: they partnered with a bank! Instead of providing credit services outsourced from Goldman Sachs, Apple should have taken a different approach. Apple has proven itself to be a tremendously profitable investment, recently earning the title of the most valuable company in the world after reaching a market value of one trillion dollars. Apple should have used this to their advantage with the Apple Card: rather than offering banking services through a third party, they should have handled everything themselves, backing interest with their stock price and using the money stored to fund projects. Such a move could make it significantly easier and more secure to invest in the most valuable company in the world, while also revolutionizing the banking system and making it significantly easier to deal with. This move could bring Apple fully into a new industry and simultaneously redefine it, the same way that, with the iPhone, the company entered the smartphone industry and made itself an invaluable player in it.
A lot of the time, the company that tries to be the most revolutionary in its field ends up being the least successful. This struggle of great concepts failing as products is prevalent in the field of electronics, where new innovations that could change the way we use technology come around incredibly frequently. However, even though these innovative new technologies could push a shift in our usage patterns, they often don't, typically following a cycle: a brief hype period after the showcase of a new technology, succeeded by a handful of products from hopeful manufacturers, followed by a brief period of profitability and then inevitable market failure, concluding in a halt in manufacturing and a dismissal of the idea's prospects due to a justified lack of confidence. One example of this cycle in play that you can witness at this very moment is the folding phone. The technology garnered tremendous hype and speculation, but when Samsung became the first company to finally bring it to consumers, their product flopped, and so have all the companies that followed Samsung. But why is this? Why do "revolutionary products" often fail even if at heart they are great ideas? The answer is simple, one word simple: execution. The success of any product, innovative or iterative, relies not on the initial idea but on the final execution. An idea is meaningless; what you can do with it is priceless. In the case of the folding phone, the idea is great, but every execution of it to date has been met with sheer failure. Companies haven't spent the time and effort to make a great product, instead pushing often faulty, more expensive devices with user experiences inferior to those already widely accessible. It really comes down to putting in the dedication to make something great, and if it can't be done, if the technology isn't ready yet, then don't do it.
Innovation isn't just thinking of a great idea; it's turning that idea into an equally great product, and if you can do that, you can take over the world.
Whenever developing a new product in the field of consumer electronics, or any field for that matter, one inevitably encounters the following debate: should I make my product the best it can be, or the best it can be for cheap? Those last two words are the sole reason this conundrum exists, and they have been the central point of countless meetings, conference calls, and college lectures since the American industrial revolution. However, there is something fundamentally wrong with this debate; in fact, the debate itself shouldn't even exist. This battle between the best possible user experience and the most accessible price tag stems from the idea of two different facets of accessibility: accessibility through affordability, and accessibility through ease of use. Seemingly, these two accessibilities are parallel and completely incompatible; at least, the countless companies that have faced this struggle would have you believe it to be that way. Typically, companies choose which of these accessibilities to focus on by determining what approach would better suit their product, usually coming to a crude conclusion by asking themselves a question along the lines of: do I want my product to find itself in the hands of a ton of somewhat content users, or a smaller number of happier ones? If the answer is the former, then the better choice would theoretically be to create a cheaper product, one more accessible to a wider range of users due to a price more of them would be able and willing to pay. On the other hand, if the answer is the latter, then theoretically the better route would be creating a more refined and, as a result, more costly product. However, the reality that product developers who follow this strategy fail to pick up on is that you don't need to sacrifice one of these accessibilities in favor of the other; you don't have to cut off your nose to spite your face.
You don't need to spend exponentially more to get a better, more refined product; you just need the right people. While yes, one avenue of acquiring great employees is enticing them with flashy benefits and bigger paychecks, both requiring increased expenses, you can just as easily reach the same result with a great product. If you feel that you have a great product, and you can convince others that you do, then the people working on that product will pour their heart and soul into it. This points to another misconception in the field of product development that is appalling to me: the notion that the most important people to sell your product to are the press. This is far from the truth; really, the most important group of people to convince are your employees. If you truly see greatness in your product, and you can convince the people working on it of its greatness too, you'll never need stock options, a kombucha fridge in your office, or a month of paid vacation days. If you can convince the right people of your product's greatness, they'll put everything into ensuring its success, and you won't have to spend an extra dime. Then, and only then, can you sell the world on your product and make it truly accessible, in all definitions of the word.
No. In fact, the computer will replace the smartphone. This is thanks to the superior productivity and creativity applications available on the computer, which require the computer's specific form factor (whether it be a tablet, laptop, or desktop) to be used effectively. The phone isn't the evolution of the personal computer; it's the extension of it. It's meant to perform the tasks it was initially designed to perform, namely light web browsing, messaging, and social applications. The introduction of AR will one day negate the need for a phone, and it will be impossible for it to do so without the PC, likely using the PC to harness processing power not feasibly available in a chassis small enough to be a pair of glasses. So no, the phone will not replace the PC; instead, the reverse will happen.
Whenever a new Apple product comes out, you're sure to hear everything about it for days. If it's a phone, you hear about the fancy new camera; for AirPods, the noise cancelling; for the MacBook, the new keyboards; but for all of these products, the thing you seem to hear about the most is always the price. Apple is a premium company, and their name isn't exactly synonymous with cheap. But many think that Apple's products, especially in recent years, aren't just expensive, but overpriced. It's not hard to see why some people have this sentiment; two hundred and fifty dollars for a pair of headphones or one thousand for an iPhone isn't what you would call cheap, but calling these products overpriced is another story. The detail that many Apple naysayers omit when deeming the company's products overpriced is that when you buy an iPhone, MacBook, or AirPods, you're not just paying for the phone, laptop, or headphones; you're paying for the whole user experience. Apple products hold with them the guarantee of a user experience that is a cut above the rest, and one that carries a higher price tag. This elevated user experience is pricey for a company to develop, with countless hours, dollars, and assets poured into its design, research, and development. This extra effort has allowed for user experience breakthroughs such as the seamless integration between AirPods and iOS devices, or, even farther back, the original Mac and its graphical user interface. So when you read that it costs Apple two hundred and fifty dollars to assemble a phone they charge one thousand for, you're not seeing the whole picture. It's not just the assembly; it's the seamless nature of the software it runs, the research behind its innovative new features, and the breathtaking design that embodies the device.
All of these factors, and the processes, time, and people behind them, make Apple's devices more costly to develop than their competitors', but they're also what makes them better, and they truly make an Apple product an Apple product.