4 Reasons Data Driven Small Businesses Are Embracing Big Data

Small businesses are thinking bigger about their data – and it’s about time.

The term big data sounds intimidating – as if reserved for Fortune 500 leaders – but that could not be further from the reality of data analytics in today’s competitive small business market.

Previously the exclusive domain of statisticians, large corporations and information technology departments, data and analytics are becoming broadly available – call it a new democratization – giving small businesses and consumers access to cost-effective, sophisticated, data-powered tools and analytical systems.

For small businesses, big data will deliver meaningful insights on markets, competition and bottom-line business results. For consumers, too, the big data revolution promises a wide range of benefits.

New Tech, New Rules

Today, big data is changing the rules of commerce and business operations, creating opportunities and challenges for small businesses. The convergence of three leading computing trends – cloud technologies, mobile technologies and social media – is creating cost-effective, data-rich platforms on which to build new businesses and drive economic growth for small and large businesses alike. This helps boost local economies as well as global e-commerce and trade.

Optimizing Insights

Digital data will continue to turbocharge the movement to understand analytics, in both small and large businesses. Proprietary data combined with data from the cloud will continue to create new insights and a deeper understanding of what consumers need, what they like and what will keep them happy.

The development of new data sources and unique analytics will drive entrepreneurial growth around the globe over the coming decade.

Better Management

Today, small businesses can leverage business management solutions, including Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) software platforms, to automate operational management tasks and keep better watch over their very own big data – including analytical views of sales and marketing campaigns.

Small businesses can stay on top of accounting, cash flow, budgets, balances and more with financial management software alternatives, as well as tools and applications for inventory management, project management, fleet management, human resources and more.

Real-Time Decisions

By optimizing real-time data analytics, small businesses today are capturing a better view of their administrative, sales and marketing practices – including real-time overviews of what’s working well and what needs scrutiny. Small businesses mining their own big data today routinely deploy a variety of solutions – most originating in the cloud – to improve operational and administrative efficiency and productivity, while reducing manual tasks and redundancies.

Today, small businesses are no longer intimidated by big data. They are embracing it to create and manage bigger opportunities for growth and profitability.

Today’s competitive small businesses realize that optimizing analytics and business intelligence allows them to realize the full benefits of their very own big data – powering better marketing, sales and operational efficiency, productivity and functional gains. With data-driven tasks and decisions in the mix, a new culture of small business is emerging, powering greater opportunities for the small business community, its vendors and customers.

Angela Nadeau  is CEO of CompuData, an award-winning business technologies leader. Angela maintains a deep knowledge of the trends driving businesses today to be more productive and profitable by leveraging technology. With more than 25 years of expertise, she has advised thousands of businesses on effective ways to leverage technology to increase productivity, profitability and efficiency – guiding businesses of all sizes to new levels of market success and corporate growth.

Originally posted via “4 Reasons Data Driven Small Businesses Are Embracing Big Data”



Apr 13, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


Issue #15
Contact Us: info@analyticsweek.com


I hope this note finds you well. Please excuse the brief interruption in our newsletter. Over the past few weeks, we have been doing some A/B testing and mounting our newsletter on our AI-led coach TAO.ai. This newsletter and future versions will use the capabilities of TAO. As with any AI, it needs some training, so kindly excuse/report the rough edges.

– Team TAO/AnalyticsCLUB





[ AnalyticsWeek BYTES]

>> The What and Where of Big Data: A Data Definition Framework by bobehayes

>> The Cost Of Too Much Data by v1shal

>> Unraveling the Mystery of Big Data by v1shal



>> How a Data Scientist’s Job ‘Play in Front’ than other BI and Analytic Roles – CIOReview (under Data Scientist)

>> AI, Machine Learning to Reach $47 Billion by 2020 – Infosecurity Magazine (under Machine Learning)

>> Software to “Encode the Mindset” of Lawyers – Lawfuel (blog) (under Prescriptive Analytics)



Lean Analytics Workshop – Alistair Croll and Ben Yoskovitz


Use data to build a better startup faster in partnership with Geckoboard… more


Storytelling with Data: A Data Visualization Guide for Business Professionals


Storytelling with Data teaches you the fundamentals of data visualization and how to communicate effectively with data. You’ll discover the power of storytelling and the way to make data a pivotal point in your story. Th… more


Analytics Strategy that is Startup Compliant
With the right tools, capturing data is easy, but failing to handle that data well can lead to chaos. One of the most reliable startup strategies for adopting data analytics is TUM, or The Ultimate Metric: the metric that matters most to your startup. Some advantages of TUM: it answers the most important business question, it cleans up your goals, it inspires innovation and it helps you understand the entire quantified business.


Q: What is cross-validation? How do you do it right?
A: Cross-validation is a model validation technique for assessing how the results of a statistical analysis will generalize to an independent data set. It is mainly used in settings where the goal is prediction and one wants to estimate how accurately a model will perform in practice. The idea is to hold out part of the data as a validation set during the training phase, in order to limit problems like overfitting and to gain insight into how the model will generalize to an independent data set.

Examples: leave-one-out cross validation, K-fold cross validation

How to do it right?

• The training and validation data sets must be drawn from the same population.
• Watch for temporal structure: a stock-price model trained on a certain 5-year period cannot realistically treat the subsequent 5 years as a draw from the same population.
• A common mistake is tuning outside the loop: for instance, the step of choosing the kernel parameters of an SVM should be cross-validated as well.
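That last point is usually addressed with nested cross-validation: the hyperparameter search runs in an inner loop inside every outer fold, so tuning never sees the outer validation data. A minimal sketch using scikit-learn – the data set, model and parameter grid here are illustrative choices, not prescribed by the text:

```python
# Nested cross-validation: the kernel parameters of an SVM are chosen
# by an inner 3-fold grid search, repeated inside each of 5 outer folds.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)

# Inner loop: pick C and gamma by 3-fold cross-validation.
inner = GridSearchCV(
    SVC(),
    param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.01]},
    cv=3,
)

# Outer loop: estimate the generalization error of the whole
# procedure (search + fit), not of a single tuned model.
scores = cross_val_score(inner, X, y, cv=5)
print(f"Nested CV accuracy: {scores.mean():.3f}")
```

Cross-validating only the final model, with hyperparameters already tuned on the full data, would leak the validation data into model selection and bias the error estimate downward.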
Bias-variance trade-off for k-fold cross validation:

Leave-one-out cross-validation (LOOCV): gives approximately unbiased estimates of the test error, since each training set contains almost the entire data set (n − 1 observations).

But: we average the outputs of n fitted models, each trained on an almost identical set of observations, so the outputs are highly correlated. Since the variance of a mean increases when the correlation between the averaged quantities increases, the test error estimate from LOOCV has higher variance than the one obtained with k-fold cross-validation.

Typically, we choose k=5 or k=10, as these values have been shown empirically to yield test error estimates that suffer neither from excessively high bias nor high variance.
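The two estimators are easy to compare side by side. A small sketch, again with scikit-learn and an illustrative data set and model:

```python
# Compare 5-fold cross-validation with LOOCV on the same model.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score

X, y = load_diabetes(return_X_y=True)
model = LinearRegression()

# 5-fold CV: each validation fold holds ~20% of the data.
kfold_scores = cross_val_score(
    model, X, y,
    cv=KFold(n_splits=5, shuffle=True, random_state=0),
    scoring="neg_mean_squared_error",
)

# LOOCV: n fitted models, each trained on n - 1 observations.
loo_scores = cross_val_score(
    model, X, y,
    cv=LeaveOneOut(),
    scoring="neg_mean_squared_error",
)

print(f"5-fold CV MSE: {-kfold_scores.mean():.1f}")
print(f"LOOCV MSE:     {-loo_scores.mean():.1f}")
```

The mean estimates are typically close; the k-fold version needs only k fits instead of n and, per the argument above, averages less-correlated outputs.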




#BigData @AnalyticsWeek #FutureOfData #Podcast with Eloy Sasot, News Corp


Processed data is information. Processed information is knowledge. Processed knowledge is wisdom. – Ankala V. Subbarao


#BigData @AnalyticsWeek #FutureOfData #Podcast with Joe DeCosmo, @Enova


140,000 to 190,000: the projected shortage of people with deep analytical skills needed to fill the demand for big data jobs in the U.S. by 2018.



*This newsletter is hand-curated and autogenerated using #TEAMTAO & TAO; please excuse some initial blemishes. As with any AI, it may get worse before it gets relevant – we appreciate your patience and feedback.
Let us know how we could improve the experience using: feedbackform

Copyright © 2016 AnalyticsWeek LLC.

What to Look for in a Healthcare Big Data Analytics Vendor

Healthcare big data analytics is a booming business, which is both a good and a bad thing for providers seeking to bulk up their infrastructure to supplement their EHRs with sophisticated tools for clinical analytics, population health management, and predictive insights.

The number of up-and-coming big data vendors is growing every day as providers recognize the need to treat data as a resource instead of a burden, and picking a winner out of the pack isn’t always easy for healthcare organizations constrained by finances and concerned about developing long-term, effective partnerships.

If you understand your healthcare big data analytics technology options, are preparing to put your team into action, and are ready to move forward with a strategy to harness big data as a way to drive quality improvements and organizational efficiencies, it’s time to dive into the murky world of vendor selection.

HealthITAnalytics.com explores what to look for in a healthcare big data analytics vendor in order to ensure that a provider gets the right technology for its needs in the short term while keeping options open for shifting and changing strategic goals.

Matching what you have to what you want

As specialists trying to participate in the EHR Incentive Programs have learned to their cost, one size doesn’t fit all when it comes to health IT initiatives.  A large, well-known corporation may be able to boast about their brand recognition and have a client list a mile long, but not all healthcare organizations – or big data sets – are created equal.

Healthcare organizations must have a clear idea of what their data sets look like before they can match their needs and goals to a service provider.  Those that have invested heavily in structuring their EHR input may wish to begin their big data programs with general clinical analytics, as many hospitals do.  Others focused more on research, complex cases, or bolstering their clinical decision support might want to turn to companies that offer cognitive computing or natural language processing that can comb through bulky narrative text.

Providers must also examine their existing infrastructure and decide whether they can build upon technologies already in place, or if they would prefer to rip everything out and start again.  Can the vendor accommodate your legacy systems?  Do you need to invest in basic infrastructure like a data warehouse or master patient index in order to benefit from your potential vendor’s wares?  What are the costs involved in bringing your infrastructure up to baseline, and how long will it take to see a satisfactory return on these investments?

The majority of healthcare organizations do not feel fully prepared to tackle these questions at the moment, but that is quickly changing as experience replaces trepidation.  Healthcare big data analytics is a messy business at the best of times, but don’t let an overeager vendor trivialize how much work must be done in order to get the most out of a contract.

A commitment to interoperability and data standards

Vendors must treat interoperability as more than a buzzword these days as federal agencies, consumers, payers, and patients all crack down on data siloes that make big data analytics such a headache.  After Congress raised questions about vendors who actively block the type of information sharing that is vital for care coordination and population health management and the ONC responded with a widely-read report on the matter, vendors have started to change their tune on interoperability.

The rise of interoperability coalitions like Carequality and the CommonWell Health Alliance may make it a little easier for healthcare providers to identify vendors who are committed to health information exchange, but even the combined might of both organizations does not include a majority of the big data analytics companies on the market.

It is up to healthcare providers to ask about the foundations of a vendor’s technologies and how they will interact with other products, providers, and partners.  A few important questions to ask include:

• Is your product built on open standards or proprietary architecture?  Does it accept APIs, and is anyone actively developing them?

• How easy will it be for my organization to participate in large-scale analytics or health information exchange with a state or local entity, my accountable care organization, public health departments, and research organizations?

• How will your product interface with my existing health IT systems?  What sort of user experience can my clinicians and other staff expect?

• Have you considered the growing importance of medical device integration and the Internet of Things?  How will your technology adapt to the need to integrate additional data sources as patient-generated health data becomes more critical to providing quality care?

Transparent business practices and pricing structures

Taking the pledge for interoperability is just one part of having sound business practices that will encourage long-term partnerships.  While the ONC’s data blocking report may have reportedly spooked some vendors into dropping data exchange fees, the question of who has the rights to demand cash for patient data in motion and at rest has sparked some serious debates.

In 2013, the ONC released a guide for providers looking to negotiate EHR replacement contracts, urging them to pay attention to terms that would limit the transfer of patient information to a new system or cut off access to data during a dispute.  The advice about contract negotiation applies equally to an EHR system or a big data technology, each of which can be licensed for use on an organization’s own technology or provided as a service in the cloud.

The ONC warns providers to pay close attention to liability language that may exonerate the vendor from any responsibility should patient harm arise from unexpected downtime, a privacy violation, or an error or omission in the data.  “Developer contract language often includes indemnification language that shifts liability to you without regard to the cause of the problem or whose ‘acts or omissions’ may have given rise to the claim,” the guide says.

“You may want to negotiate with the EHR technology developer a mutual approach to indemnification that makes each party responsible for its own acts and omissions, so that each party is responsible for harm it caused or was in the best position to prevent,” the ONC suggests.

The guide also suggests courses of action for dispute resolution, intellectual property issues, warrantees, and confidentiality agreements.  Most vendors are willing to negotiate these terms to some degree, but be wary of those who insist on an all-or-nothing approach. Before signing on the dotted line, providers should be sure they are clear about their expectations and responsibilities, as well as ensuring they understand the pricing structures for data storage and transfer without falling victim to hidden fees or sudden hikes in a payment plan.

A balance of track record and innovation

Healthcare big data analytics is all about discovering novel and ingenious ways to use information, but providers investing millions of dollars in new infrastructure want to be sure that they aren’t throwing money down the drain.  Despite the general enthusiasm around embracing new ideas for analytics, executive leaders are still a relatively conservative bunch.

This year’s HIMSS Leadership Survey indicated a very high level of board room support for expanding innovative health IT and data analytics capabilities, yet more than a third of organizational leaders would prefer that innovation be tested at another organization first.  Just 24 percent of respondents said that their executive leaders were “open to trying ‘bleeding edge’ technology,” which puts big data analytics purchasers in a quandary.  After all, someone has to be the first to try something new – and to possibly reap the rewards of being adventurous.

But investing in start-up technology companies with big dreams and little real-world experience can be a risky proposition for providers who are looking to stretch every dollar they invest.  Venture capital investment in population health management and analytics companies is through the roof, but not every outfit that receives funding gets bought by a major player or scores a huge IPO.

Healthcare organizations should look for vendors who have secured adequate funding for their products, have working, bug-free examples of their software or hardware to display, offer robust customer support services, have firm timelines and plans for implementation, and don’t make promises they seem unlikely to be able to keep.

The ability to expand and grow with you as strategic plans change

Healthcare organizations are constantly being bombarded with new initiatives, shifting goals for federal mandates, and major changes to health IT programs, reimbursement structures, and quality improvement goals.  As the industry begins to embrace value-based payments and care structures driven by the need to provide high quality services and produce better outcomes, organizational needs and goals must be flexible.

Vendors have to be flexible, too, and be able to provide the right insights at the right time for the task at hand.  While technology turnovers are inevitable as new capabilities and standards move through the market, healthcare providers are looking for products that can carry them through at least a few years of turmoil without requiring a complete overhaul.

Healthcare providers can help themselves make the right choices by having a solid strategic vision for their organization over the next three to five years as meaningful use winds down and accountable care heats up.  Providers may wish to ask themselves:

  • How will I tackle population health management and the increasingly expensive proposition of caring for patients with complex chronic disease needs?  Will our patient demographics change significantly over the next few years?  How can we be proactive about addressing their needs?
  • How will the shift to value-based reimbursement drive the need for improved operational efficiencies within my organization, and how do I think big data will help?
  • What data exchange and interoperability capabilities do I need to ensure care coordination across the continuum?  How can my business partners and I work together to bring data-driven healthcare insights to our community?
  • What patient safety and care quality goals are we hoping to meet?  How can gaining deeper insights into our clinical care produce better patient outcomes?
  • What revenue cycle management issues do we need to address?  Can we turn patient behavior data into better collections, or will an investment in preventative care keep high-cost services to a minimum?
  • How can we improve our data integrity and data governance to maximize our investment in healthcare big data analytics?  Do we need to retrain our EHR users, hire more health information management professionals, or build a dedicated team of data scientists?

Healthcare big data analytics is such a rapidly expanding field that capabilities that seem commonplace today didn’t exist five years ago, and will probably be outdated five years from now.  But understanding your organizational objectives will help you make the best possible decisions with the information available at the moment, and hopefully set up your big data program for long-term future success.

Choosing the right vendor is a critical component of seeing the benefits of big data, and providers should not underestimate the degree to which open communication during this type of ongoing partnership will be required.

After thoroughly considering how a technology purchase will impact their goals, providers should look for stable, responsible, capable, and innovative vendors that offer high quality products with transparent, reasonable pricing structures if they wish to be pioneers in the field of big data.

Originally posted via “What to Look for in a Healthcare Big Data Analytics Vendor”


Google Offers ‘Preemptible’ Virtual Machines

If you don’t mind running a virtual machine (VM) that will live for just 24 hours, and can be shut down with only 30 seconds advance notice, then Google may have a great deal for you.

It’s offering Google Compute Engine Preemptible Virtual Machines, a short-lived VM type that’s in beta right now, according to a post on the Google Cloud Platform Blog. “Preemptible VMs are the same as regular instances except for one key difference – they may be shut down at any time,” wrote Paul Nash, Senior Product Manager.

Fixed Pricing — For a Price
By not giving a guarantee that the VM will be available, Nash said, Google can fix pricing for it, allowing a company to have more predictable costs. He said prices will start at $0.01 per core hour.
That low price comes attached with strings the size of skyscrapers, however. For one, the VM’s runtime is limited to 24 hours. And, if Google should need the space or resources being occupied by that VM, the VM will be terminated without prejudice.

That’s why Google says the best types of applications for these VMs are fault tolerant and can afford to be interrupted. It gives an example of batch processing jobs: “If some of those instances terminate during processing, the job slows but does not completely stop.”

Terminated Without Prejudice
Google adds that the probability that a running VM will be shut down is “generally low,” but it will probably happen from time to time. And those VMs will be terminated after running for 24 hours, no matter what.
There are two other important caveats:

• In some cases it may not even be possible to create a Preemptible VM; it depends on whether Google has Compute Engine resources available.
• Preemptible VMs can't live-migrate, so if one is shut down, it won't spin back up on another host.

Nash blogged that Preemptible VMs can be created through the Google Developer Console, or by adding "--preemptible" to the gcloud command line. A Google pricing page shows that some of its cheapest per-hour VM charges range from $0.03 up to $0.11 per hour or more. Thus, Preemptible VMs could yield substantial savings — for the right workload.
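As a concrete illustration of the command-line route, a preemptible instance can be requested with a single extra flag. The instance name, zone and machine type below are placeholders, not values from the article:

```shell
# Create a preemptible VM; the --preemptible flag is the only
# difference from a regular instance request. Name, zone and
# machine type are illustrative placeholders.
gcloud compute instances create batch-worker-1 \
    --zone=us-central1-b \
    --machine-type=n1-standard-1 \
    --preemptible
```

Because the instance can be reclaimed with 30 seconds' notice, workloads launched this way should checkpoint their progress so a terminated worker can be replaced without redoing the whole job.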

Originally posted at: https://virtualizationreview.com/articles/2015/05/18/google-offers-preemptible-virtual-machines.aspx
