2010: Office Wars


Microsoft Office has been THE standard for document creation in the English-speaking business world for at least a decade. The combination of Word, Excel, PowerPoint, and Outlook, when paired with an Exchange Server, has been the de facto standard in business. Word can manage simple text documents, complete academic papers, and even simple desktop publishing tasks. Excel spreadsheets are used in everything from accounting to basic databases to mini-programs when combined with macros. And who hasn’t heard of “Death by PowerPoint”? Outlook is the preeminent e-mail/calendar/groupware solution, with no real competition outside the ever-present but lagging Lotus Notes. Microsoft has owned the business content creation market since the early 90s, with no realistic competition. They have a vested interest in protecting the profit center around Office, and they won’t go down without a fight…

In 2010, the game is finally going to change. 2009 was marked by trends around the mobility of business users and the increasing globalization of companies’ workforces, and so the word “cloud” has snuck into IT vocabularies across the nation. To serve mobile users, data and documents need to be accessible anywhere. Various startups (Zoho, Writely (acquired by Google), and others) have been touting this concept for years under the banner of “collaborative document editing”. Realistically, all this means is that the latest version of the document is stored in the “cloud” and multiple users can access and edit it either online or offline (uploading their changes later).
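To make that model concrete, here’s a minimal Python sketch of the idea: a server-side store keeps the latest revision, and an offline edit is only accepted if it was based on that revision. The CloudStore class and its revision check are purely illustrative, not any particular vendor’s API.

```python
# Minimal sketch of cloud document storage with revision checking.
# The server keeps the latest revision; clients edit a local copy (possibly
# offline) and push their changes back. All names here are illustrative only.

class ConflictError(Exception):
    """Raised when an edit was based on a stale revision."""


class CloudStore:
    def __init__(self):
        self._docs = {}  # document name -> (revision, content)

    def fetch(self, name):
        """Return the latest (revision, content) for a document."""
        return self._docs.get(name, (0, ""))

    def push(self, name, base_revision, content):
        """Accept an edit only if it was based on the latest revision."""
        current_rev, _ = self._docs.get(name, (0, ""))
        if base_revision != current_rev:
            raise ConflictError(f"{name} changed on the server (now rev {current_rev})")
        self._docs[name] = (current_rev + 1, content)
        return current_rev + 1


store = CloudStore()

# User A edits online and saves immediately.
rev, text = store.fetch("budget.doc")
store.push("budget.doc", rev, text + "Q1 numbers\n")

# User B fetched the same revision earlier, edited offline, and uploads later.
try:
    store.push("budget.doc", rev, text + "Q2 forecast\n")
except ConflictError as err:
    print("Sync conflict, re-fetch and merge first:", err)
```

Everything interesting in this space (merging, conflict resolution, simultaneous editing) lives behind that ConflictError branch, and that is where the real products will differentiate themselves.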

The battle will unfold with new innovations from Microsoft and Google. Microsoft’s Office 2010 is introducing a number of significant changes to their business model, as follows:

  1. Office 2010 will have a free version – Until now, the cheapest version was Home & Student, which could occasionally be found for $69.99 on special
  2. Office 2010 will save documents to the cloud – Microsoft will support this natively, offering free, but limited storage for home and business use
  3. Office 2010 will focus on interoperability – The EU has been pushing Microsoft to open up its formats and natively support ODF, and Microsoft is voluntarily adding built-in support for PDF (a quick scripting sketch of what that enables follows this list)
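As a small illustration of that interoperability point, here’s a hedged Python sketch that drives Word through COM automation to save a document out as ODF and PDF. It assumes a Windows machine with Word 2007 SP2 or later and the pywin32 package, the file paths are placeholders, and the two numeric constants are the documented WdSaveFormat values.

```python
# Sketch: export a Word document to ODF and PDF via COM automation (pywin32).
# Assumes Windows with Word 2007 SP2+ installed; paths are placeholders.
import win32com.client

WD_FORMAT_PDF = 17   # WdSaveFormat.wdFormatPDF
WD_FORMAT_ODT = 23   # WdSaveFormat.wdFormatOpenDocumentText

word = win32com.client.Dispatch("Word.Application")
word.Visible = False
try:
    doc = word.Documents.Open(r"C:\docs\report.docx")
    doc.SaveAs(r"C:\docs\report.odt", WD_FORMAT_ODT)   # OpenDocument Text
    doc.SaveAs(r"C:\docs\report.pdf", WD_FORMAT_PDF)   # PDF
    doc.Close(False)                                   # close without saving again
finally:
    word.Quit()
```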

Google, the upstart competitor in this battle, has been adding features to their Apps offering to really compete in this space. Here’s their approach:

  1. Google Apps will not match Microsoft feature for feature – Google Apps just needs to be “good enough” (and cheaper) for customers to justify switching
  2. Get Major Institutions to switch to Google Apps
  3. Build an Ecosystem around Google Apps – Offline through Chrome and Extensions through Labs
  4. Emphasize Search – Google started as a search company, and searching all your documents from anywhere, anytime will be a sizable draw…

So, where does this lead? Microsoft Office is the obvious incumbent, with great new features coming. Google Apps is getting stronger every day, but it clearly lacks (and will continue to lack) some features. Either way, this will be a great year for document creation, making it easier for consumers and businesses to share and collaborate on documents. I, for one, am excited to see the results…


Solid-State Storage: What’s the hold up?

Solid-state technology is far superior to traditional hard drives in random access time, sequential and random reads and writes, and vibration resistance. As of today, two major factors are still holding solid-state back from widespread adoption: cost and reliability.

Hard Disk Drives (HDDs) have been the mainstay of computer-based long-term storage since the personal computer was invented. They have followed a path similar to Moore’s Law on the CPU side: roughly every 18-24 months, hard drive capacity doubles, which in turn causes prices per gigabyte to collapse. This has been driven by an insatiable demand from consumers and enterprises alike to store more and more data. But how long is that sustainable? When do customers start to say “I don’t need any more storage”? Now, I fully realize that Bill Gates reportedly said “640k ought to be enough for anybody” back in 1981, and we see the obvious fallacy there. However, I think the main question is not how much storage each individual person needs, but where they need it.

I believe that consumers and enterprises will both continue to need increased storage (storing endless amounts of pictures, videos, music, etc.), but I think that users will start to migrate toward a more “online” approach.

Thus, there is a crossover point for solid-state drives: consumers and enterprises keep “enough” local space for short-term storage, then turn to online or archival methods (probably very large hard drives) for long-term storage. We haven’t yet hit the point where SSD technology is cheap enough for this to really make sense.

On the consumer front, I believe that most users will be able to get by on a 128GB SSD. When this drive hits $150, it will gain widespread (>20%) market share within two quarters.
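For a rough sense of what that threshold means, here’s a quick back-of-the-envelope comparison; the hard drive figures are an assumed ballpark, not a quoted price:

```python
# Back-of-the-envelope price-per-GB comparison at the predicted tipping point.
# The HDD numbers are an assumed ballpark, not a quoted price.
ssd_capacity_gb, ssd_price = 128, 150.00    # the predicted $150 128GB SSD
hdd_capacity_gb, hdd_price = 1000, 90.00    # assumed ~1TB consumer hard drive

ssd_per_gb = ssd_price / ssd_capacity_gb    # ~$1.17/GB
hdd_per_gb = hdd_price / hdd_capacity_gb    # ~$0.09/GB

print(f"SSD ${ssd_per_gb:.2f}/GB vs HDD ${hdd_per_gb:.2f}/GB "
      f"({ssd_per_gb / hdd_per_gb:.0f}x premium)")
```

Even at that price, the SSD still carries a double-digit premium per gigabyte, which is exactly why it only makes sense once 128GB is “enough” for the working set and the bulk data moves online or onto an archive drive.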

On the enterprise front, I think reliability is still the key issue, as the cost-to-performance ratio is already at parity with equivalent SAS/FC solutions. I recently visited Oracle OpenWorld up in San Francisco, and every major vendor had an offering utilizing solid-state technology. At this point, though, it’s still new technology: wear-leveling, crash tolerance, and methods of measuring risk are all still maturing.
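For readers new to the term, wear-leveling is the controller technique of spreading writes across all the flash blocks so that no single block burns through its limited erase cycles early. Here’s a toy Python sketch of the basic idea; real SSD controllers are far more sophisticated, and everything here is illustrative only.

```python
# Toy illustration of wear-leveling: repeated writes to the same logical
# block are remapped onto the physical flash block with the fewest erase
# cycles, so wear spreads evenly instead of wearing out one block.

class ToyFlash:
    def __init__(self, physical_blocks=8):
        self.erase_counts = [0] * physical_blocks
        self.mapping = {}                    # logical block -> physical block

    def write(self, logical_block, data):
        # Pick the least-worn physical block (ignoring data relocation, GC, etc.).
        target = min(range(len(self.erase_counts)), key=lambda b: self.erase_counts[b])
        self.erase_counts[target] += 1       # each rewrite costs an erase cycle
        self.mapping[logical_block] = target # the data itself is ignored in this toy

drive = ToyFlash()
for _ in range(100):
    drive.write(0, b"hot data")              # hammer the same logical block

# Without wear-leveling, one block would absorb all 100 erases; with it,
# each of the 8 physical blocks ends up with roughly 12-13.
print(drive.erase_counts)
```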

2010 will be the true rise of solid-state in the industry, with 2011 seeing most desktop, notebook, and enterprise systems incorporating solid-state in some way. The hard drive manufacturers already know this; note WD’s purchase of SiliconSystems and Seagate’s plans around solid-state.

Disclaimer: I already have a 128GB Crucial SSD in my Windows 7 notebook and it SCREAMS

Who gets the credit? – Mobile Navigation


Early last week, Google made an announcement that rocked the GPS market (namely Garmin and TomTom). The odd thing is that a single announcement from Google got all the press, the web, and the tech market in general talking about personal navigation and how this is a “game-changing technology.”

In fact, Google’s announcement really isn’t anything revolutionary. GPS technology has been present in mobile phones for over two years, and Google Maps goes back to 2005 (with MapQuest before it launching in 1996). The fact that investors either didn’t see this coming or weren’t factoring it into the stock price is amazing to me. Turn-by-turn navigation is already provided today, streaming to any Windows Mobile-based phone with GPS built in. The only difference is that Microsoft isn’t marketing it very well. Microsoft has taken a back seat this year when it comes to marketing, allowing the products to speak for themselves. Bing for mobile, although not as polished as the upcoming Google offering, is a great tool for finding information on the road. It has built-in voice recognition, turn-by-turn directions, satellite view, etc.

In essence, Bing mobile today has EVERY major feature mentioned in Google’s announcement, yet the product has made almost no noise in the industry. Microsoft needs to get back to letting the industry know that they are offering leading technology products, and continue to innovate.

That said, Bing mobile still has some room for improvement. Here are three things that would help this product succeed:

  • Higher-Resolution Maps – Google Maps looks great on high-resolution, high-dpi devices. I have a new 480×800 screen, and the Bing maps look like oversized low-res images. The advent of 3G data bandwidth and high-resolution screens means it’s time for an update.
  • Updated UI – The opening screen for Bing mobile includes options such as “Maps”, “Traffic”, “Directions”, “Gas Prices”, etc. To me, this seems redundant: all of this information ends up displayed on a map anyway, so open straight to the map and let me layer the info I want on top of it. Quick buttons on the map interface would be much better than the plain white background.
  • Marketing – What good is the product if no one knows about it? Learn to enlist social media and new media outlets to get mainstream attention.

Windows 7 Launch Day


October 22nd, 2009 is the official retail launch of Windows 7 for Microsoft. It’s been a long time coming, and many have said that “Windows 7 is what Windows Vista should have been”. I definitely agree with this statement, and I wholeheartedly advocate that Windows users migrate to Windows 7 as soon as possible.

But, let’s look at WHY to upgrade. First off, one thing I hated about Windows Vista was that my computer always seemed to be “doing something” even if I wasn’t running any applications… CPU usage was all over the place and memory usage would keep growing. With Windows 7, that problem is solved; it FEELS like a lightweight operating system. But aside from the generalities, what specifically can Windows 7 do that Windows XP can’t?

  • Native 64-bit CPU, memory, and application support
    • With memory being so cheap, and 64-bit applications around the corner (Office 2010, Photoshop, even IE & Firefox), it’s important to be ready for the future (and you CAN’T do an in-place upgrade from a 32-bit to a 64-bit O/S; see the quick check after this list)
  • Great new version of Windows Media Center (including new Internet TV features)
  • Updated, modern GPU-accelerated interface
    • New taskbar to keep all your most-used programs close at hand; add Jump Lists, and it’s easy to get to the documents you need in fewer than two clicks
  • Startup in less than 30 seconds, Shutdown in less than 10 seconds, Resume almost instantaneously
    • Instant gratification, what more can you want?
  • HomeGroup functionality (like a mini-domain)
    • Easily share documents, media and printers with a simple password
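Since that 32-bit to 64-bit jump requires a clean install, it helps to know what you’re running before you plan the upgrade. Here’s a small, hedged Python sketch that reports whether the Windows install you’re on is 64-bit (the environment-variable check covers the case of a 32-bit process running on 64-bit Windows):

```python
# Report whether the current Windows installation is 64-bit.
# A 32-bit process on 64-bit Windows sees PROCESSOR_ARCHITEW6432 set to AMD64.
import os
import platform

os_is_64bit = (
    platform.machine().endswith("64")
    or os.environ.get("PROCESSOR_ARCHITEW6432", "").endswith("64")
)
process_is_64bit = platform.architecture()[0] == "64bit"

print("Windows is 64-bit:", os_is_64bit)
print("This process is 64-bit:", process_is_64bit)
```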

Those are the big new functions that I personally use every day. I have a total of 3 computers in the house, all of which are running Windows 7 Ultimate 64-bit. It’s simply better than Windows XP in every regard and is well worth the upgrade. Always do a clean install when upgrading to ensure your computer is performing its best.


Disclaimer: I’m the Business Development Manager for Microsoft at ASI, so I am a little biased

The Digitization of Knowledge

I recently rediscovered the joy of the public library system. Back when I was a kid, I used to devour books by the shelf, reading over 100 books a year. I would tear through everything from cheap Star Wars novels to non-fiction books on political theory. Needless to say, it wasn’t long before my parents stopped buying me new books (which I would really read only once anyway) and turned me over to the public library system, where a local branch was only a few blocks away. These were the times when I could take out up to 10 books at a time for zero cost to me (except for late fees of course). Knowledge was accessible to anyone, and you could take just about any book home with you if you had a library card.

Today, America’s libraries are much the same as when I was a kid, and they provide even more access to knowledge, thanks to the power of computers and the internet. Yet somehow, the same unbridled access to any information or knowledge at no cost seems to have become a thing of the past. The American Library Association seems to strongly advocate “Intellectual Freedom” which they define as “the right of every individual to both seek and receive information from all points of view without restriction.” However, something seems to be getting lost in the transition from the dusty stacks in the physical library to today’s digital revolution. This premise of access to information as a basic human right seems to be forgotten in today’s lawbooks.

Intellectual Property is definitely a tough problem to solve as we move into an era where the American economy is more service- and knowledge-based, as opposed to a traditional manufacturing society. Intellectual Property, at its base, is really no more than an idea, thought, or performance that has been transcribed onto some sort of digital medium. No one will dispute that the creators of these thoughts, ideas, and methods deserve some sort of recognition and the ability to recoup the effort that went into that creation. However, two major factors have changed since our original laws around patents and intellectual property were drafted.

First, when computers debuted after World War II, they created a new medium: software. I remember my dad used to tell a story about working at Rockwell when they were designing the Space Shuttle, and the rocket scientists needed to know how much the software physically “weighed,” since they had to account for every ounce in their launch calculations. Of course, software has no literal weight, which had to be explained over and over, but the principle is an important one to note. If something can be copied with almost no effort at all, with no physical loss to anyone else, all that has been transferred is simply an idea. And isn’t an idea merely information?

Second, the rise of the internet now allows access to information that isn’t just locally stored, but can literally be anywhere in the world and is mere milliseconds away. Hundreds of individuals, groups, and businesses publish information on the internet for free. I’d say the vast majority of pages are free, and the for-pay sites are in the relative minority. Of course, we have numerous corporations trying to monetize large amounts of information right now; take Rupert Murdoch and Google as examples. Are they selling the information, the service of accessing it, or both? The opportunity cost of this information should be next to nothing. Data warehousing is cheap, especially when hard drives bring costs below $0.10/GB.
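To put “next to nothing” in perspective, here’s a quick illustrative calculation; the per-book size and per-gigabyte cost are rough assumptions, not measured figures:

```python
# Rough cost of storing a large digitized library at consumer disk prices.
# Both inputs are illustrative assumptions, not measured values.
avg_book_size_mb = 1.0     # plain text of a typical book, roughly
cost_per_gb = 0.10         # consumer hard drive storage, dollars per GB

books = 1_000_000
total_gb = books * avg_book_size_mb / 1024
print(f"{books:,} books ≈ {total_gb:,.0f} GB ≈ ${total_gb * cost_per_gb:,.2f} of raw disk")
```

Even allowing a couple of orders of magnitude of overhead for page scans, metadata, and redundancy, the storage cost per book is still on the order of a penny.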

I’m all for free markets, so let’s keep this one free and see how it plays out. Putting access to information in the hands of any single party (as the Google Book Settlement would) is dangerous, and the best way to guard against this is to distribute the load across many players. I don’t think we should simply hand all the information and rights to one clearinghouse; providing a free market for the information (including “orphan works”) is the only way to secure competition and free access to it. When other players (Amazon, or some other non-profit) scan the texts, let them decide how much they want to charge and who gets paid for it.