Friday, January 25, was the 35th anniversary of the Apple Macintosh computer. It’s hard to believe it’s been that long. I was in college at the time, and one of my professors bought a Mac and brought it to class. We were all amazed at what a graphical user interface looked like and at what this little computer could do. It was so different from the Apple II computers we were using then. I know it’s an over-used term, but it seemed like magic at the time. I was enthralled and wanted one so badly, but didn’t have the money. I had had “the bug” for computers since a high school trip to the local college, where I got to play around on a mainframe. Seeing the Mac and how different it was cemented my interest in computers. It would be many, many years before I was able to own my own Mac, but these early experiences led to my career in the software industry.
Looking back over the past 35 years of computing history, I’m amazed at how much was accomplished on what, by today’s standards, were very simple, low-powered machines. Not only did those early Macs help usher in a new era of computing, with new ways to do desktop publishing, project management, financial analysis, and more, but people also found ways to use these new computers in ways their creators never imagined. They learned new ways to create things and new ways to work. This is not unlike today’s world, where new types of computing devices are being used to create and communicate, often in ways not originally designed into the device.
As I’m writing this blog post on my iPad, I can’t help but compare the experience of working on the original Mac and working on this iPad. I remember what people said and wrote about them back then. There are many parallels between the perceptions of those original Macs and today’s iPads: Mac vs. PC, GUI vs. command line, ease of use vs. power, mouse vs. keyboard, iPad vs. laptop, real work vs. toy, and so on. Maybe these types of arguments are inevitable when something new threatens to replace or improve upon the incumbent. Some are eager to embrace the new, even if it means a learning curve, while others are reluctant and are often the ones to point out all the ways the new technology will never be as good.
One thing that jumps out at me in this comparison is the size of the original Mac’s screen: 9” with a resolution of 512 x 342 pixels. Such a small screen did not stop people from doing great work. In today’s world, many would say a small screen cannot be used to get real work done. But for me, a small screen provides focus. The notion that a giant monitor is required to do anything of importance seems wrong to me. The 11” iPad Pro that I’m working on has an effective resolution of 1194 x 834 (the actual resolution is 2388 x 1668, since Retina screens use twice the pixels in each dimension for a sharper image). That’s more than twice the original Mac’s screen space in each dimension, and large enough to handle almost anything.
When I was still working, I eventually bought into the idea that larger and more screens would improve productivity. I had always preferred working on laptops, as I loved being able to always have my work with me. I didn’t find the smaller screens to inhibit my ability to get things done. But many smart people around me convinced me that more was better. And indeed, I saw many people with multiple large monitors who seemed so much happier and who said how much more productive they were. So, I jumped in! First by setting up a desktop PC (I had only been using a laptop) and connecting a large monitor. It was OK, and I’m sure the newness of it made me feel as if I was more productive. Then, I connected another monitor. Then I replaced those with larger monitors. At one point I had three large monitors connected to my PC, all in the pursuit of better productivity. I soon realized, though, that for me this did nothing for my productivity. It had the opposite effect. I felt overwhelmed by information. I felt surrounded. I like to stay focused on the task at hand, and having apps and windows spread across multiple screens just distracted me. I ended up switching back to a single monitor and was much more productive.
Similarly, after I retired and started doing app development, I did it all from a 15” MacBook Pro. I had tech friends who were shocked that I didn’t use a monitor, or several, attached to my laptop. But I found that having a single, big enough screen was perfect for me. I could focus and easily switch to other windows or apps as needed.
Now that I’ve switched to the iPad as my main computer, I’m more than happy with the screen size and, really, the entire computing experience. I plan to write more about the switch in the future: what took time to re-learn, what still seems harder, what’s better, and how I do my work.
I was going to write more today about the iPad vs. laptop debates that have been going on, especially since Apple started pushing harder on the notion that the iPad is a laptop replacement. But the whole thing seems a bit silly to me, and you can find plenty of articles online if you desire. We each have different uses and needs for our technology, and different comfort levels with learning new ways of doing things. I say: use the technology that works best for you. But, at the same time, be cognizant that others may have different needs and different levels of comfort with technology. If you’re a techie and often asked for your opinion, be sure to take that into account. Sometimes it’s hard to separate what we need or want from what others need or want. But we should.
It’s fun to walk down memory lane and think about how the world has changed. To me, though, it’s even more fun to think about the future, and where we’ll take the world!