Related blog: Coming Soon to Your Doctor’s Examining Room by William Moore, CTO of CareCore National
“It’s a boy!!!” my friend Kim told me just minutes after her 18-week ultrasound. Even though we were texting, I could tell her excitement was restrained despite the exclamation points. Later that day she shared, “He’s healthy, but…[big inhale]…he has a cleft lip [even bigger exhale].”
This unexpected information meant more tests for her and her unborn son, Mason. It meant a series of surgeries beginning at six months of age and continuing until age five. It brought a lot of anxiety to Kim’s entire family.
In addition, the diagnosis raised a lot of questions such as, “Will Mason be okay? How will my family support him and cope with our baby having surgery? Will my insurance cover all that is needed to treat his cleft lip? Will his treatment be personalized? Will I…will he…be subjected to unnecessary tests? Will there be a lot of tests? Can I trust that his healthcare team is up to date on all the latest treatments? Will there be a team of healthcare experts to support us as Mason recovers from each surgery?”
Kim had a lot to prepare for and wanted to feel confident about Mason’s healthcare team. She wanted to know that the most experienced doctors would provide the best care possible based on leading industry practices. What she wanted most was peace of mind that her son would be okay.
Improving the outcomes of patients like Mason while simultaneously alleviating the burden on physicians is no easy task. It takes a bold and innovative company to tackle such a challenge, one who is at the forefront of the healthcare industry and can envision improved care, better outcomes, and healthier people.
CareCore National is such a company. It currently has contracts with more than 25 health plans, working with 600,000 physicians who provide care to 68.8 million people.
Tags: CareCore, Cisco Data Center, Cisco Services, cloud, EMC, healthcare, jill shaul, VMware
Many Big Data related innovations have been developed by Web 2.0 companies, resulting in a growing collection of open source technologies that dramatically change the culture of collaborative software development and the scale and economics of hardware infrastructure. These technologies enable cost-effective data storage, management, and analysis at a scale that was not possible with traditional technologies such as relational database management systems.
NoSQL is one such technology that has emerged as an increasingly important part of big data trends for applications that demand large volumes of simple reads and updates against very large datasets (Hadoop is the other innovation, a generic processing framework designed to execute “read only” queries and batch jobs against massive datasets). NoSQL is often characterized by what it is not, and definitions vary. It can be Not Only SQL-based or simply Not a SQL-based relational database management system. NoSQL databases form a broad class of non-relational database management systems that are evolving rapidly, and several solutions are emerging with highly variable feature sets and few standards.
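The access pattern described above — large volumes of simple reads and updates keyed by a primary key, with no joins or complex queries — can be sketched with a toy key-value store. This is a generic illustration of the NoSQL key-value model, not the Oracle NoSQL Database API; the class and key names are invented.

```python
# Illustrative key-value access pattern typical of NoSQL stores:
# simple put/get by key, no joins, no multi-row transactions.
# (Hypothetical sketch; not any vendor's actual API.)

class KeyValueStore:
    """Toy in-memory stand-in for a distributed key-value database."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value               # simple update by key

    def get(self, key, default=None):
        return self._data.get(key, default)   # simple read by key


store = KeyValueStore()
store.put("user:1001", {"name": "Kim", "plan": "basic"})

# A read-modify-write cycle: the workload NoSQL systems optimize for.
record = store.get("user:1001")
record["plan"] = "premium"
store.put("user:1001", record)
```

In a real NoSQL deployment the `_data` dictionary would be partitioned and replicated across many nodes, which is what makes this simple interface scale to very large datasets.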
While these technologies are attractive for the innovations they can bring, not all products meet enterprise requirements. Many organizations require robust, commercially supported solutions for rapid deployments and the ability to integrate such solutions into existing enterprise applications infrastructure.
To address these needs, Cisco and Oracle are the first vendors collaborating to deliver enterprise-class NoSQL solutions. Exceptional performance, scalability, availability, and manageability are made possible by the combination of the Cisco Unified Computing System (UCS) and Oracle NoSQL Database. Together, they provide a platform for quick deployment, with predictable throughput and latency for the most demanding applications.
Tags: Big Data, NoSQL, Oracle NoSQL Database
We invited William Moore, CTO at CareCore National, to share his thoughts on how cloud and big data are impacting the healthcare industry. Read the related blog, “It’s a Boy!”
Now that the initial frenzy of the cloud revolution is settling, solid applications are providing a glimpse of the potential of cloud computing to change daily life for the better. In my industry, healthcare, the cloud is not simply transforming existing processes, but actually enabling new decision-making models that simply weren’t possible before.
Why Electronic Medical Records Fell Short
The healthcare industry made an earlier attempt at transformation with electronic medical records (EMRs). The original notion was that individual physician practices could justify the investment in servers, software, and maintenance based on efficiency gains. Then we’d bubble up the health records data from multiple organizations, and it would be a Shangri-La moment for chronic disease models, coordinated care, elimination of duplicated care, and more.
But reality fell short of the mark. Many physicians’ offices are really small businesses at heart. They were hard pressed to afford EMR infrastructure and all that went with it. Efficiency gains are minuscule at best if you simply print out patient charts each morning, place them on that same old clipboard, mark them up with a ballpoint pen, and then have the office manager enter the new information into the EMR system to print out next time.
Without a critical mass of EMR infrastructure, developers lacked the incentive to create standards and unifying protocols. And the lack of protocols prevented meaningful sharing of data.
Even if some of your healthcare providers do use EMRs, it’s rare that all of your providers can see your records. Connecting EMRs among more than a handful of physician practices is not technically feasible, nor is it appropriate.
Tags: CareCore, electronic medical records, emr, healthcare
After chatting with other colleagues in the industry (namely Greg Ferro, aka EtherealMind and of PacketPushers fame; Stephen Foskett of Tech Field Day fame, and also writer of his own blog; and Ivan Pepelnjak who, amongst other things, writes an excellent blog on networking topics), it’s clear that trying to stay up to speed with all the various happenings in the data center is pretty near impossible, because of the sheer volume of information and a signal-to-noise ratio that sometimes gets out of hand. While all of us, in our own little ways, are doing our piece to help address this through blogging, white papers, seminars, and the like, we wanted to try something a little different to really dig into some of the more complex topics, and came up with the concept of Virtual Symposiums:
- Focused discussion on a single topic
- Panel discussion with industry experts (or at least folks that believe themselves to be experts)
- Focus on helping attendees understand the “how” and “why” of different technologies as opposed to advocating a particular perspective
- Interactive, open to audience interaction–the goal is to get your questions answered
Our first symposium, next Tuesday, is going to cover storage convergence. Joining our panel for this discussion, we are lucky to have J Metz (@drjmetz) lending his FCoE and storage expertise. We will also be joined by special guest Stu Miniman (@stu), Principal Analyst from Wikibon, who brings perspectives shaped by over a decade in the storage market. We are going to spend approximately the first half of the symposium discussing the storage options out there: FC, FCoE, and iSCSI, and when we think each one makes sense (or not). As I noted above, the goal is not to push a particular technology agenda, but to educate you and let you make your own decisions. The balance of the session is open for Q&A; we will cover some of the common questions that we see all the time, but we expect you, the audience, to drive a lot of the discussion.
So, mark your calendar and join us; if you are familiar with this crew, you know it will be both educational and entertaining.
This particular session will be delivered via WebEx on Tuesday, March 27 from 9am-12pm Pacific Time. The link to participate is
https://events-cisco.webex.com/events-cisco/onstage/g.php?t=a&d=209024880 and the password is “cisco”. There is no pre-registration necessary.
For subsequent symposia, we are looking at other meaty topics like VM networking and data center interconnect, and we are not done with storage yet.
Here is the replay link for folks that miss the live session:
Tags: data center
Some people say that in the next few years Infrastructure as a Service cloud deployments will be focused mostly on private clouds, and that enterprises will migrate to public clouds after they have become “experienced” in running a cloud. About a year ago I could really see this story playing out. Now, fifteen months after we introduced Cisco Intelligent Automation for Cloud, I have some different points of view. I would have thought that by now private cloud architectures would have begun to converge to a few standard patterns. This has not happened. The world is still diverging when it comes to both private and public cloud architectures.
I do see patterns arising in successful cloud deployments, and here are some of the key ones:
#5: Pragmatic Approach: IT shops that come with a long list of RFP requirements and questions take a long time to source a technology provider and to achieve production success. Others that are pragmatic (can I say Agile?) in their approach get to cloud quicker and learn from their successes and missteps alike.
#4: They Have a Cloud Instance Roadmap: After a cloud deployment, some IT organizations think that is it: they are done, next project, my move to cloud is complete. Hold it right there. Did you know that cloud is not a single step where you throw a switch, but a succession of deployments of growing scope from one step to the next? A roadmap is needed that covers hardware, network, and storage infrastructure; virtualization technology and release version; management and orchestration software version; and finally the services that you are offering to the end users and how the service catalog is changing over time. Those that have a roadmap roughed out are generally more successful than those that have a big-bang perspective.
#3: Appreciation for the Challenge of Change Management: Moving to cloud is a big change in an operating model; careers are created and new roles are defined. How does an organization move to the new model with different technology, processes, and people? When a team proactively manages change in these non-technical areas, they ensure long-term success. It is not just about self-service, cloud catalogs, orchestration, domain management, and virtualization. It is more about service designers and automation authors and changes in operational processes.
#2: Rise of the Cloud Architect: Since cloud is about a new operating model, a new position and role is needed. If you have a cloud project and do not have a cloud architect tying it all together, from cost models, to hypervisors, to orchestration and orderable service definitions, you need an organizational role tune-up ASAP.
#1: A Service-Centric Approach: Most people get this one right away. Service-centric projects are the key focus for ITaaS. However, I can’t tell you how many times, when I am talking to an IT team, the opening bell results in a speeds-and-feeds conversation around provisioning this piece of infrastructure and that virtualization API. If you ask what services they want to offer their end users for self-service ordering, you will get a request for more time to answer that question. Service-centric IT shops will take the time to start first with the business requirements and the perspective from the end user’s point of view. Transform your cloud project approach to a service-centric agile project and you will go far.
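One way to picture the service-centric starting point described in these patterns is to define the orderable service as the end user sees it, before any talk of hypervisors or provisioning APIs. The sketch below is hypothetical; the class, field names, and service options are invented for illustration, not taken from any Cisco product.

```python
# Hypothetical sketch: describe the self-service catalog entry first,
# from the end user's point of view. All names here are invented.

from dataclasses import dataclass, field


@dataclass
class CatalogService:
    """An orderable service as the end user sees it in the catalog."""
    name: str
    description: str
    options: dict = field(default_factory=dict)  # user-facing choices
    sla_hours: int = 4                           # time-to-deliver target


# A service designer starts here, not with infrastructure speeds and feeds.
dev_vm = CatalogService(
    name="Developer VM",
    description="Self-service virtual machine for development teams",
    options={
        "size": ["small", "medium", "large"],
        "os": ["linux", "windows"],
    },
)
```

Only after a definition like this is agreed upon would the team map each option to the underlying infrastructure and orchestration workflows, which is the ordering of work the post advocates.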
Tags: Cisco Intelligent Automation for Cloud, IaaS, intelligent automation, ITaaS, orchestration, private cloud, Public Cloud