We need an open-source, easy-to-contribute-to virtual assistant.

I typically don’t support open source technologies. They tend to be unpolished, inferior products with poor user experiences and unfocused visions. However, in the case of virtual assistants, I think the field could benefit greatly from an open source offering. The problems with most virtual assistants come from either privacy concerns, as with Amazon and Google, or an inferior database, as with Apple’s Siri. An open source virtual assistant could curb both setbacks by letting supporters contribute to the assistant’s database voluntarily and on their own terms. Manufacturers could develop apps for the assistant and integrate their products directly with it, specializing the features it offers when used with their hardware, while users could build a more personalized, user-friendly assistant that caters to their specific needs in ways other assistants can’t. Eric von Hippel, an expert on user innovation, emphasizes that user innovators (people who create and contribute to something for their own benefit) can make products that work exactly as they need them to, whereas manufacturers and companies are forced to take a one-size-fits-all approach to maximize reach and market share. A one-size-fits-all approach is a good one in many situations, but when the product is supposed to be tailored to the user, a broad base of voluntarily contributed data could lead to unprecedented success in creating a more user-friendly and capable virtual assistant.

Analog Concepts in a Digital World Part 3: Centralization

One key promise of the internet, upon its arrival, was that it didn’t belong to anyone, but to everyone. Now, however, many are concerned that the biggest players in the centralized internet, the social media platforms, own the web, and that if you wish to reach any audience, you must sing in their hall. So, is the future of the internet centralized? Will there be a decentralized revolution? And if there is, what does that mean for the user experience?

Analog Concepts in a Digital World Part 2: Scarcity

With the advent of the internet and the virtual world, scarcity was one aspect of the physical dimension many had hoped to leave behind. But alas, scarcity has meandered its way into the digital realm and introduced another barrier into what could be an endless plane of possibility. It started with Digital Rights Management, meant to ensure fair compensation for content, but quickly exploded into cryptocurrency, essentially the most sugar-coated form of manufactured scarcity. So the question is, will scarcity put a cap on the limitless possibility of the internet, or will it tear it open further?

Analog Concepts in a Digital World Part 1: Anonymity

Why do we wear masks? There is an inherent seduction packaged with the mystique of a secret identity. We feel empowered by secrecy; it is like a drug, making us feel as if we can do anything, because no one is looking. The KKK used anonymity as a weapon, hiding behind masks to commit treacheries under the backdrop of the moon and acting like different men at the sun’s rise. Technology, like a mask, empowers us through secrecy: we can attack, hurl vicious insults, and destroy people with a keyboard and mouse, without them ever seeing our face. Virtual secrecy is a danger often left unchecked; in the past, fear of public humiliation or shame for attacking others, verbally or physically, has kept us in line. But now technology enables the deepest levels of secrecy, and people use this to their advantage to wage virtual war with bombs and planes of derogatory remarks and insults. So the question is, will we become less sensitive, or more power-high?

The future of computing through fractal geometry.

Fractal geometry, to put it simply, is the fourth dimension: it supplements the first three, or “Euclidean,” dimensions of length, width, and height, and it describes the space in between them. For example, a Euclidean definition of a mountain would be a cone, since a cone is easily described by its length, width, and height, whereas a “fractallian” definition would describe the mountain in terms of its components. This is done according to the three principles of fractals (self-similarity, recursiveness, and initiation): each component of a fractal is similar to the whole fractal, and that self-similarity is infinite in detail, going on forever. So how do I go from writing about iPads and the future of computing to this? The jump is not as far as it seems, because in my mind fractals hold the key to the future of computing. A key property of fractals is infinite detail in a limited amount of space, a property that could translate into computing quite clearly. Standardized data with minuscule variations could allow for self-similar binary (1s and 0s, for the non-tech-savvy) that is infinite in detail while still being finite in the amount of space it takes up. This would allow for effectively infinite storage and memory, which would simultaneously expedite the convergence of memory and storage and remove another limitation on compute power. However, this concept hinges on translating Mandelbrot’s formula into binary. The formula is an iteration, z → z^2 + c: each new value of z is the previous value squared plus a constant c, repeated endlessly, and out of those small repeated steps the complete fractal emerges (think of the full triangle being built from its little component triangles, as in Pascal’s triangle). The translation into binary would work much like the familiar translation into color, where a point whose value races off towards infinity is assigned a color based on how quickly it escapes, and a point whose value stays bounded is drawn in black. Once this could be translated, a major barrier in computing could be broken, and major technological leaps could be made.
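To make the escape-time coloring described above concrete, here is a minimal sketch in Python. It is not part of the original post: the function names escape_time and render, the iteration cap, and the grid size are all my own illustrative choices. The code iterates z → z^2 + c for each point of a small grid and prints a coarse ASCII picture, drawing bounded points (the “black” of the usual renderings) as “@” and shading escaping points by how quickly they blow up.

```python
# Minimal escape-time sketch of the Mandelbrot iteration z -> z^2 + c.
# All names and constants here are illustrative choices, not anything
# from the original post.

MAX_ITER = 50  # arbitrary cap on the number of iterations per point


def escape_time(c: complex, max_iter: int = MAX_ITER) -> int:
    """Return how many iterations z -> z^2 + c takes to exceed |z| = 2,
    or max_iter if the value stays bounded (the point is 'in' the set)."""
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n
    return max_iter


def render(width: int = 72, height: int = 28) -> None:
    """Print a coarse ASCII picture of the Mandelbrot set."""
    shades = " .:-=+*#%"  # quicker escape -> lighter character
    for row in range(height):
        line = ""
        for col in range(width):
            # Map the pixel grid onto the part of the complex plane where
            # the set lives (roughly -2..1 real, -1.2..1.2 imaginary).
            c = complex(-2.0 + 3.0 * col / width, -1.2 + 2.4 * row / height)
            n = escape_time(c)
            # Bounded points play the role of "black" in the usual images.
            line += "@" if n == MAX_ITER else shades[min(n, len(shades) - 1)]
        print(line)


if __name__ == "__main__":
    render()
```

Run as-is, this prints the familiar cardioid-and-bulb silhouette; the same escape count could just as easily be mapped to a color palette, which is the translation the post is gesturing at.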

After 10 years, can we finally call the iPad a computer?

Steve Jobs with the original iPad
Image via Digital Trends

10 years ago on this day, the post-PC era began. At least according to Steve Jobs it did. 10 years ago, Steve Jobs sauntered onto a San Francisco stage to announce possibly the most anticipated product in his company’s history: the iPad. To Jobs, the iPad was more than just a tablet; it was the final nail in the PC’s coffin and the first glimpse at the future of computing. But 10 years after its unveiling, many are still hesitant to call the iPad a computer. This has nothing to do with the iPad’s functionality today, and everything to do with its functionality 10 years ago. When the iPad first came out, it was focused on content consumption. Its reveal keynote highlighted its potential as an e-reader and its video streaming prowess rather than its applications in writing, business environments, or content creation, all fields that were dominated by the PC. It’s obvious why the iPad was so centered on content consumption, given the tempting ease of advertising the device as a portable TV or a slate that lets you carry the Library of Alexandria in your backpack, but these weren’t the features people were looking for in a tablet; they wanted Word, Photoshop, GarageBand, and the other apps previously exclusive to PCs and Macs. That association between the iPad and content consumption is still strong in the minds of many today, and it is the key reason so many people have a hard time seeing the iPad as a computer the way they see their laptop or desktop as one, when in reality the iPads of today are entirely different beasts from the first one ten years ago. Modern iPads check all the boxes of a computer: physical keyboards, file managers, productivity and content creation applications, large screens, and the other features key to a computer replacement. So yes, the iPad can be called a computer, and it should be, but that doesn’t mean that everyone will.

Is it too early to ditch the iPhone’s last port?

The Lightning Port
Image via CNBC

Apple is known for making its devices as simple as possible by taking away what it deems to be unnecessary additions. This policy goes all the way back to the original Mac, which didn’t feature arrow keys in order to push users toward the mouse. More recently, it can be seen in the removal of the headphone jack on the iPhone and the narrow port selection on current MacBooks. While this practice has come under fire, it has allowed for the adoption of more futuristic technologies, such as the aforementioned mouse or wireless earbuds like Apple’s own AirPods, which wouldn’t have been nearly as popular as they are if you didn’t need a dongle to connect regular wired headphones. However, we might not be ready for the next step in this line of removing complications from our devices. According to multiple reports, the one with the most weight coming from CNBC, Apple could ditch the Lightning port on the next iPhone in favor of wireless charging only. Such a move would certainly be true to Apple’s history, but are we ready yet? Wireless charging has seen some widespread adoption, and the technology has clear benefits over conventional charging over a wire: it’s easier to put your device on a charger, easier to take it off, and it allows for a sleeker design. It’s also important to note that, unlike the removal of the headphone jack, the removal of the charging port would not be that big of a loss. With the massive batteries in today’s iPhones, which let them last all day without a charge, one of the major advantages of conventional charging, its portability, is largely eliminated. If a wireless charger, which will presumably be included with the new iPhone in place of a conventional cable and brick, sits at home, and you rarely need to top off your phone during the day, you don’t really need a charging brick and cable to bring with you. Sure, on road trips you might still want a wired charger, but I foresee new cars coming with built-in wireless charging pads, and for right now, many companies make portable wireless chargers. I think that, just as with the arrow keys, the headphone jack, and countless other examples, taking something away would be giving you an experience that is that much greater.

Why hasn’t Apple made a folding phone yet?

The Samsung Galaxy Fold, recently released after a delay due to a faulty design
Image via Android Central

Last week the Consumer Electronics Show, or CES for short, was held in Las Vegas. Tons of future tech products were shown off, from 5G phones, to 8K TVs, to self-driving cars, but one technology showed up the most: foldables. Dell, Huawei, and HP all showed off new foldable phones, laptops, and tablets. So if this is the future of consumer electronics, why hasn’t Apple made a foldable yet? Have they lost their edge over their competitors? Have they fallen behind the pack? The answer to both questions: no. The reason Apple hasn’t made any folding iPhones, iPads, or MacBooks while every other tech company has products using this technology is not that every other tech company is smarter than Apple, but that Apple is smarter than every other tech company. Right now, folding phones suck. Apple knows this. They saw what happened with the Galaxy Fold, how the review units broke within days of journalists receiving them; everyone did. Apple knows that folding technology isn’t ready yet, and they know that if they want to make a truly great folding product, it needs to be. Apple has done this before: when they were developing the iMac, Steve Jobs refused to use a tray-loading disk drive, even though CD-burning drives didn’t yet come in the slot-loading form factor. Jobs knew it was better to give users a better experience, one where they didn’t have to press a button to insert and play disks, than to give them the ability to burn CDs. This appreciation for the user experience, even at the cost of some functionality, is the core foundation of Apple today, what sets it apart from Dell, HP, Lenovo, and the rest of the pack, and the policy that has made it the most popular, and most profitable, tech company in the world.

What is the future of the Mac?

The Original Mac
Image via AppleInsider

The year is 1984. IBM has a tight grasp on the personal computer market, and many think it will stay that way. But then Apple, a fledgling company with a small but loyal group of users, introduces a new personal computer, one that seeks to dethrone Big Blue as the king of the PC, and it’s called the Mac. Small, well designed, and friendly looking, the original Mac didn’t look like anything else in its league, a league filled with ugly beige boxes and towers. While it wasn’t Apple’s first computer, the Mac put the company on the map, and it was their main product line for the majority of their time as a company. The iMac, the iPod, even the iPhone, iPad, and Apple Watch: none of these would have been possible without the first Mac. But now, 35 years after Steve Jobs walked onstage and unveiled the computer that would change the world, the purpose and future of the Mac are as unclear as ever. A product line that used to carry the weight of the company has been relegated to an afterthought. Sure, Apple has paid more attention to the Mac when it comes to pro products, but their focus for consumer computers is on the iPad. Furthermore, having two lines of personal computers is confusing for customers to navigate, leaving them to decide between the iPad, arguably the better personal computer and the more forward-thinking product, and the MacBook, the faulty-keyboard-adorned, weaker product of the two. To me, it seems the future of the Mac is that of a professional device, for programmers, filmmakers, and other creative professionals, while the iPad serves as the consumer computer. This would fix Apple’s confusion problem while giving them a strong foot in both the consumer and professional markets, allowing them to reach the widest range of customers while retaining the amazing user experience that came with the original Mac, way back in 1984.