Category: Cloud Computing

  • Speaking of healthcare data, is Microsoft the elephant in the room?

    In a previous blog I discussed the need for a uniform data structure in healthcare. The concept got me thinking about how to accomplish such a monumental task, and make no mistake, it would be a monumental task. There aren’t many “people” out there who could develop hardware and software infrastructure solid enough to handle the complex data stream coming out of the healthcare industry.

    Then I noticed a trend at a lot of the web sites that I frequent: Microsoft has slowly, and quietly, been positioning itself to jump into the healthcare market.

  • What we need is a system-neutral data structure for healthcare

    During a web browsing session the other day I came across a very interesting blog post by Louis Gray titled “The Future: Operating System And Application-Neutral Data”. I enjoy reading Louis’ posts because I think he has a great vision for the future of personal computing, data, and “the cloud”.

    The blog speaks specifically to the ownership of personal data versus allowing companies to sit on it and possibly hold it hostage secondary to a lack of compatibility with other systems. The information you throw onto the internet defines who and what you are, more now than ever before, and you need to be able to move it around anytime from anywhere.

  • Time for a new model of data storage and software distribution in pharmacy

    There was a time when I thought all a pharmacist needed to do his job was a pen and a calculator. It was just so cumbersome to carry anything else. If you wanted to have mobile drug information, it meant carrying a drug reference book with you everywhere. Who can forget being in pharmacy school, where every self-respecting pharmacy student had a Drug Information Handbook stuffed in their lab coat pocket, along with all the other stuff they carried, like a homemade peripheral brain scribbled on the pages of a notebook or on those neat little 3×5 cards.

  • SaaS and pharmacy

    Software as a service (SaaS) has recently been popping up in healthcare-related news, from Fujitsu’s SaaS solution for drug trials to the host of web-based applications from Pharmacy OneSource.

    SaaS is different from the traditional enterprise software model because the provider of the software licenses it to the customer as an on-demand service. The vendor often hosts the software on its own servers, where data is manipulated and returned to the customer for viewing. It’s kind of like renting software; a rough sketch of what that looks like from the customer’s side is at the end of this post.

    The beauty of SaaS applications like those from Pharmacy OneSource is that they can be viewed from any device with a web browser: Mac, PC, smartphone, etc. In addition, the application is owned, delivered, maintained and managed by the provider, limiting the burden on the customer. A by-product of this model is that delivery of the application over the web ensures that the software is always up to date.

    The SaaS model is most popular in the “business” world at present, but it is gaining ground in healthcare secondary to its simplified deployment and reduced cost. With advances in cloud computing strategy, better data storage models, and faster internet connections, I think it’s only a matter of time before we start to see more SaaS solutions in pharmacy practice. And why shouldn’t we? By their very nature SaaS applications lend themselves to use on mobile devices like the tablet PC and iPad, which in turn offers greater flexibility for pharmacists practicing at the bedside. Just a thought.
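
    To make the “renting software” idea a little more concrete, here is a minimal sketch of what consuming a web-hosted service can look like from the customer’s side. The endpoint, API key, and response fields below are invented for illustration (they are not an actual Pharmacy OneSource or other vendor interface); the point is simply that the client needs nothing beyond a network connection, because the application and the data live on the vendor’s servers.

    ```python
    # Minimal sketch of a client consuming a hypothetical web-hosted SaaS
    # endpoint. The URL, API key, and JSON fields are invented for this
    # example and do not correspond to any real vendor's interface.
    import json
    import urllib.request

    API_KEY = "demo-key"  # issued by the vendor; nothing is installed locally
    URL = "https://saas.example.com/api/v1/surveillance/alerts"

    def fetch_alerts():
        """Ask the vendor-hosted application for data; the processing happens
        on the vendor's servers and only the result comes back for viewing."""
        req = urllib.request.Request(URL, headers={"Authorization": "Bearer " + API_KEY})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    if __name__ == "__main__":
        for alert in fetch_alerts().get("alerts", []):
            print(alert.get("patient"), alert.get("message"))
    ```

    Because everything past that request runs on the vendor’s side, upgrades happen there too, which is where the “always up to date” by-product comes from.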

  • The “cloud” gets a black eye

    InformationWeek: “Think of the one million T-Mobile Sidekick customers that may have lost important data last week. Think of the dozens of CIOs that anxiously waited for Workday to restore its SaaS service on Sept 24. Cloud computing has created a new era of accountability, and we must demand that tech vendors work harder than ever to prove their trustworthiness. In both of these instances, customers were completely dependent on their vendors to manage their data. And in both instances, technical failures are to blame. The growth of cloud computing is not going to let up—we’re not going to suddenly start moving away from the Internet and speedy networks and store more data on our home PCs and company servers—so it’s time that everyone, from consumers up to CIOs at the world’s biggest companies, start asking questions and demanding accountability from their vendors.” – Cloud computing has been taking a beating in the press lately. Everywhere I turn someone on the internet is talking about the Sidekick fiasco, and I have to agree that permanently losing your customers’ data is inexcusable. However, this type of failure happens in the “non-cloud” environment as well; you just don’t hear about it. Last year our facility had an email server fail. Some, but not all, data was lost and we were without email services for nearly two weeks. It was the most productive time of my life. The cloud model is relatively immature at this point in time and will suffer failures and setbacks as it continues to develop. Hopefully the Sidekick failure has provided us with a valuable lesson that will be used to further improve the cloud. Only time will tell.

  • Musings on the “cloud”

    I tend to read a lot about cloud computing in my spare time. It’s an interesting topic and there’s no shortage of reading material, as it is a very hot topic in many circles. I still find it strange that the definition of cloud computing continues to expand at a time when it should be contracting. I’m a firm believer that the technology is available, but vendors are hesitant to take advantage of it for various reasons: cost, fear of change, security, etc. Anyway, here are some of the things that crossed my path over the past several days that I think fall into the “cloud” category.

  • Sabotaging an idea – hybrid clouds?

    InformationWeek: “What if, instead, applications throughout the data center could run at closer to 90% utilization, with the workload spikes sent to cloud service providers (a process called “cloudbursting”)? What if 85% of data center space and capital expenses could be recouped, with a small portion of that savings allocated for the expense of sending those bursts of computing to the public cloud? This tantalizing possibility–enterprise IT organizations managing an internal cloud that meshes seamlessly with a public cloud, which charges on a pay-as-you-go basis–embodies the promise of the amorphous term cloud computing. Step one with virtualization has been server consolidation. The much bigger benefit will come with the ability to move workloads on and off premises. “Anyone can build a private cloud,” says Rajesh Ramchandani, a senior manager of cloud computing at Sun Microsystems. “The gain comes if you can leverage the hybrid model.”” – So much for the purity of the cloud. I’ve read several articles lately that refer to “hybrid” or “private” clouds. Crud, my hard drive at home is a “private cloud”. I can partition it, virtualize it, and grant other users access to it. The very idea of a dynamically scalable and virtualized service over the internet disappears quickly when you begin to tie these services to local infrastructure. Having data reside locally for a short period of time to improve retrieval makes sense, but that information should eventually move to the cloud, where it stays until needed again. The article above goes on to talk about the lack of standardization in the development of the cloud model. It sounds like everyone is headed in a different direction. I really hope the trend doesn’t continue, as I think carving the cloud up into different models to suit your needs will only dilute a really good idea. Creating hybrid and private clouds will ultimately lead to another group of segregated services and a complete waste of the theoretical advantages of the cloud.
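
    For what it’s worth, the “cloudbursting” idea in the quote is easiest to picture as a simple scheduling decision: keep work on the private infrastructure while it has headroom, and push the overflow to a pay-as-you-go public provider only once utilization crosses a threshold. A toy sketch, with made-up capacities and class names rather than any real vendor API:

    ```python
    # Toy sketch of "cloudbursting": jobs stay on the private cluster until
    # local utilization hits a threshold, then the overflow "bursts" out to a
    # pay-as-you-go public cloud. Capacities and names are invented.

    class Pool:
        def __init__(self, name, capacity):
            self.name, self.capacity, self.running = name, capacity, 0

        def utilization(self):
            return self.running / self.capacity

        def submit(self, job):
            self.running += 1
            print(f"{job} -> {self.name}")

    BURST_THRESHOLD = 0.90  # aim to keep the local data center ~90% busy

    def schedule(job, private, public):
        target = private if private.utilization() < BURST_THRESHOLD else public
        target.submit(job)

    if __name__ == "__main__":
        private = Pool("private cloud", capacity=10)
        public = Pool("public cloud", capacity=10_000)
        for i in range(12):          # the last few jobs spill off premises
            schedule(f"job-{i}", private, public)
    ```

    The contentious part is everything the sketch leaves out, namely moving data between the two environments and keeping them compatible, which is exactly where the standardization problem mentioned above comes in.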

  • US Army utilizing Software-as-a-Service (SaaS) to standardize methodology

    InformationWeek: “The pilot program has already shown the way toward more consistency in environmental reporting and given the Army baseline data for forecasting, but Davis is looking for more, especially in the way of greenhouse gasses. “As we begin to understand and appreciate the benefits of this information technology, we can calculate our greenhouse gas emissions and ultimately our carbon boot print,” he said. “I want something we can audit later on, something that’s not just a back of an envelope calculation.” The Army’s choice of multi-tenant SaaS for its pilot is notable, as the military has been reluctant to use Web-based systems in other cases, especially with operational data such as on-installation emissions. “The reaction of some of our customers is, ‘Oh my gosh, on the Internet?’ ” John Garing, director of strategic planning at the Defense Information Systems Agency, said in an interview earlier this year.” – You know you’re behind the times when the US military is outpacing you in non-weapons related technology. Doh!

  • Moving storage around in the “cloud”

    ByteandSwitch: “One of the great theoretical advantages of cloud computing is the implied portability – users can move data in and among cloud resources easily, and the cloud itself may move data between and among resources without the customer being aware that anything has changed. In practice, cloud data can prove just as firmly rooted in physical location as any “traditional” data resource – but that could be changing with the rise of applications like NetApp’s new Data ONTAP 8 cloud storage system.” – The article goes on to say that the Data Motion system “allows data mobility with no downtime required for storage-subsystem expansion or scheduled maintenance.” That’s a nice thing to have, as the ability to shuffle data around without affecting end users is important, but don’t you think it’s a little weird to talk about moving data from one cloud to another? I thought the whole point of the cloud environment was to eliminate the need for things like this. Anyone?

  • Hybrid cloud to speed things up?

    ByteandSwitch: “Every week or so one major internet service or another goes down for a moment, Amazon S3, Google Apps, Twitter etc… Let’s face it, if you store data in the cloud there are a hundred variables between you and your data and if any one of those variables decides to, well, be variable, then you may not be able to get to your data for a period of time. This does not mean that you can’t use the cloud, it means that you can’t put data that you are going to need immediate access to solely in the cloud.  What this does mean is using a hybrid model for cloud storage. As we demonstrate in our latest video “What is Hybrid Cloud Storage?” a hybrid cloud is an appliance that is placed on the customer’s site to act as a intermediary storage location for data that is in route to the cloud. The appliance serves many purposes: translation from CIFS/NFS to more internet friendly protocols, local cache for rapid restores of last copy of a backup or archive and as a place to get to data that would otherwise be inaccessible due to some sort of connection issue.”

    – This actually makes perfect sense to me. One issue that often comes up when discussing a cloud environment, besides access to data, is speed. We have started using thin clients here at the hospital in place of desktop machines, and there is little doubt that performance has suffered. With the option discussed in the article above, data would move quickly between you and the local environment while in use, then slowly move into the cloud in the background. I like it.
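
    As a thought experiment, the appliance described in the quote behaves a lot like a write-back cache: writes land on fast local storage first, so the user isn’t waiting on the internet, and a background process trickles the data up to the cloud afterward, with reads served locally whenever possible. A toy sketch, with the “cloud” reduced to an in-memory dictionary and no real CIFS/NFS translation involved:

    ```python
    # Toy sketch of the hybrid appliance idea: data is written to fast local
    # storage first, pushed to the cloud in the background, and read from the
    # local cache when possible. The "cloud" here is just an in-memory dict
    # standing in for a remote object store.
    import queue
    import threading

    class HybridStore:
        def __init__(self):
            self.local = {}                 # on-site appliance cache (fast)
            self.cloud = {}                 # stand-in for remote cloud storage
            self.outbox = queue.Queue()     # data on its way to the cloud
            threading.Thread(target=self._uploader, daemon=True).start()

        def write(self, key, value):
            self.local[key] = value         # user sees the write immediately
            self.outbox.put(key)            # cloud copy happens in the background

        def read(self, key):
            if key in self.local:           # local cache hit: no round trip
                return self.local[key]
            return self.cloud.get(key)      # fall back to the cloud copy

        def _uploader(self):
            while True:
                key = self.outbox.get()
                self.cloud[key] = self.local[key]   # slow path, off the critical path
                self.outbox.task_done()

    if __name__ == "__main__":
        store = HybridStore()
        store.write("mar/room-412", "med administration record")
        store.outbox.join()                 # wait for the background upload
        print(store.read("mar/room-412"))
    ```

    The design choice is simply to keep the slow hop to the cloud off the user’s critical path, which is the speed problem mentioned above.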