Aug 27, 20: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Data analyst  Source

[ AnalyticsWeek BYTES]

>> Getting Started with Predictive Analytics in Construction by analyticsweekpick

>> Predictive Analytics in Action: 5 Industry Examples by analyticsweek

>> How to integrate big data In security systems? by administrator

Wanna write? Click Here

[ FEATURED COURSE]

Data Mining

image

Data that has relevance for managerial decisions is accumulating at an incredible rate due to a host of technological advances. Electronic data capture has become inexpensive and ubiquitous as a by-product of innovations… more

[ FEATURED READ]

Hypothesis Testing: A Visual Introduction To Statistical Significance

image

Statistical significance is a way of determining if an outcome occurred by random chance, or did something cause that outcome to be different than the expected baseline. Statistical significance calculations find their … more

[ TIPS & TRICKS OF THE WEEK]

Save yourself from a zombie apocalypse of unscalable models
One living, breathing zombie in today's analytical models is the absence of error bars. Not every model is scalable or holds up as data grows. The error bars attached to almost every model should be duly calibrated; as business models take in more data, error bars keep them sensible and in check. If error bars are not accounted for, our models become susceptible to failure, leading us to a Halloween we never want to see.

[ DATA SCIENCE Q&A]

Q: Explain what a local optimum is and why it is important in a specific context,
such as K-means clustering. What are specific ways of determining if you have a local optimum problem? What can be done to avoid local optima?

A: * A local optimum is a solution that is optimal within a neighboring set of candidate solutions
* In contrast with the global optimum: the optimal solution among all candidates

* K-means clustering context:
It is proven that the objective cost function always decreases until a local optimum is reached.
Results depend on the initial random cluster assignment.

* Determining if you have a local optimum problem:
A tendency toward premature convergence.
Different initializations induce different optima.

* Avoiding local optima in a K-means context: repeat K-means with different random initializations and take the solution with the lowest cost.

Source
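The repeat-and-pick-the-best remedy described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation (the function names and toy data are my own):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """One Lloyd's-algorithm run from a random initialization.

    Returns (centers, labels, cost) where cost is the within-cluster
    sum of squared distances -- the objective that only decreases."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assignment step: each point goes to its nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # update step: each center moves to the mean of its points
        new_centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break  # converged -- to a local optimum, not necessarily the global one
        centers = new_centers
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    cost = ((X - centers[labels]) ** 2).sum()
    return centers, labels, cost

def best_of_restarts(X, k, restarts=30):
    """The remedy from the answer: rerun with different seeds, keep the lowest cost."""
    return min((kmeans(X, k, seed=s) for s in range(restarts)), key=lambda r: r[2])

# three well-separated blobs; a single unlucky initialization can merge two of them
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in [(0, 0), (5, 0), (0, 5)]])
centers, labels, cost = best_of_restarts(X, k=3)
```

Different seeds converge to visibly different costs here; taking the minimum over restarts is exactly the "repeat K-means and take the lowest cost" advice (scikit-learn's `KMeans` offers the same behavior via its `n_init` parameter).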

[ VIDEO OF THE WEEK]

Understanding #Customer Buying Journey with #BigData


Subscribe to  Youtube

[ QUOTE OF THE WEEK]

The data fabric is the next middleware. – Todd Papaioannou

[ PODCAST OF THE WEEK]

@BrianHaugli @The_Hanover on Building a #Leadership #Security #Mindset #FutureOfData #Podcast


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Facebook users send on average 31.25 million messages and view 2.77 million videos every minute.

Sourced from: Analytics.CLUB #WEB Newsletter

Getting Started with Predictive Analytics in Construction

How current and historical data is bringing future insights to construction projects, and changing the course of the industry forever.

More and more, the industry is acknowledging that data plays an important role in construction. Still, projects produce massive quantities of data, and only a small portion of it is being used to inform decisions. One way of using data, however, is becoming more advanced and increasingly accessible for both small and large contractors: predictive construction analytics.

Using predictive analytics can help reduce risk and improve your decision making process. Read on to discover what predictive construction analytics are, why they’re important to the industry, and how you can start using these tools for better project outcomes.

Predictive Analytics 101

At the heart of predictive analytics is the ability to use current and historical data to forecast future outcomes. In other words, these tools make predictions about the future using techniques including statistical modeling and machine learning.

These techniques give the future insights generated by predictive analytics a significant degree of precision, especially with the use of machine learning. By generating algorithms based on current and historical data, machine learning is designed to solve business problems and streamline decision making, allowing you to choose the best path forward for your project.

Why Are Predictive Analytics Becoming More Important in Construction?

Each day, construction teams are managing a number of moving parts on site, from subcontractors to change orders, and beyond. The more complex construction projects become, especially in the era of social distancing and increased remote work, the more you need the kinds of tools that can take all available information into account and guide your next big decision. Enter, predictive construction analytics.

“What technology like data analytics, and even more specifically machine learning and artificial intelligence, is doing for us [construction] is unlocking our ability to harness the project data – organize it, interpret it to uncover patterns faster,” said Allison Scott, Director, Construction Thought Leadership & Customer Marketing at Autodesk, on a recent webinar. 

These tools can reduce issues, lower costs, and mitigate risk for construction projects by making the work more predictable. As an example, consider the preconstruction process. One of the biggest challenges for design teams during preconstruction is creating a realistic budget that can be applied to current and future project stages. On the construction side, teams frequently find it hard to manage the budget they receive from a project’s architecture or contractor teams. Predictive construction analytics allow preconstruction teams to create budgets that account for all possible factors that could emerge during a project, including regional labor and material costs, among other items.

Predictive analytics are poised to be a big part of the construction industry’s future. According to McKinsey & Company, solutions using predictive analytics, machine learning, and artificial intelligence will likely bring about major changes to how engineering and construction firms bid on and execute projects. Specifically, predictive analytics can help construction professionals answer questions around whether they should bid on a project, and if so, how much. These tools can also help determine if subcontractors’ bids are reasonable, and if a project is about to run into challenges. Predictive construction analytics can break down the costs and profitability of prior jobs, examine the accuracy of subcontractor bids received, and determine when and how past projects ran into trouble. All of this information can then generate the answers you’re looking for, before a new job has even begun.

Tips for Getting Started with Predictive Analytics in Construction

1. Hone in on your focus area

The best way to start implementing predictive analytics solutions for your next construction project is by first honing in on your area of focus. Going too broad in your adoption of predictive analytics can set you back, resulting in wasted time and disorganization. You should first determine one or two key focus areas where you want to bring in more predictability to your project. For example, do you want to better anticipate and mitigate safety and quality issues? Or perhaps you’d like more visibility into project risk, like budget overruns or labor challenges? Identify where you need more predictability and select a solution from there.

2. Find the right tools to measure

When it comes time to select the best predictive analytics tools, finding the right solutions based on your focus area can help you achieve your overall project goals. The right software for the construction industry can help with risk management around cost, schedule, quality, and safety. This solution can also help you evaluate subcontractor performance and mitigate day-to-day risks for future projects. Predictive analytics can also help safety managers understand the leading indicators to potential behavioral and environmental hazards, and take proactive measures before incidents arise. Moreover, a predictive analytics solution tailored to the construction industry can help executives identify risks across projects and take measures to improve project performance and set any job up for success.

3. Standardize and centralize

Finally, getting the most out of predictive analytics requires you to centralize and standardize your data. The higher the quality of your data input, the higher the quality, and the better the predictive power, of your data output. This is why it's essential to establish a centralized data platform with standardized ways to input and structure information, improving the accuracy of the predictive analytics solutions you use. Implementing a common data environment is one way to achieve this, allowing your team to optimize and utilize information when it's needed most. Moreover, good data empowers future technologies, including machine learning and AI, to accelerate project delivery.

Predictive Analytics in Action

The use of predictive analytics tools in the construction industry has contributed to a number of successful project outcomes. Over the last few years, BAM Ireland, an operating company of Royal BAM Group nv (BAM), has utilized BIM 360 Construction IQ, a predictive analytics software for the construction industry, to manage risk and streamline its workflows.

The software flagged a number of inconsistencies in BAM Ireland’s documents, including issues that were labeled as open despite being addressed and closed by project teams. Additionally, the system identified a number of critical issues that remained open, allowing the BAM Ireland team to address them before they became major challenges.

“A huge problem here for us is overdue issues,” Michael Murphy, digital construction operations manager at BAM Ireland, explained.

“If we fix these problems early, they’re cheaper to fix. If we start with a $25 issue that could be fixed in design, if that gets to construction, that increases to $250 to fix. If it’s spotted during snagging that will be $2,500. If it gets into operation it could cost $250,000. Knowing where the issues are early on is essential.

“If this system [Construction IQ] is taking a lot of heavy lifting away it’s giving us a laser sharp focus in terms of what the genuine health and safety issues are. We don’t have to explain how it works to the team; it just happens! Not only is it pointing at major issues, but it’s giving us more time.”

BAM Ireland has seen a 20% improvement in on-site quality and safety, and a 25% increase in staff time spent on high-risk issues since adopting Construction IQ as its predictive analytics solution. What’s more, as Construction IQ continues analyzing every BAM Ireland project, it is refining its prediction capabilities and improving the accuracy of its insights.

Predict Success Today

Predictive analytics can help you get organized and put your current and past project information to work toward success in the future. Finding the right predictive analytics solution for your next construction project starts with discovering how these tools can work for you. Learn more about available solutions and put predictive analytics to work for all of your future construction projects.

The post Getting Started with Predictive Analytics in Construction appeared first on Autodesk Construction Cloud Blog.

Originally Posted at: Getting Started with Predictive Analytics in Construction by analyticsweekpick

Data Governance and AI: New Dimensions in Privacy and Compliance

By now, many of us are familiar with the general tenets of data governance: inventorying what data lies where, controlling data access, protecting sensitive data, and documenting all of it. But in the context of machine learning (ML) and artificial intelligence (AI), new governance requirements arise.

Beyond identifying sensitive data, we must determine whether it can be analyzed in aggregate and discover if the aggregated data can be de-anonymized. Beyond securing datasets, we must determine who can build ML models on them. In addition to cataloging datasets, we need to monitor changes in them, then alert data scientists to retrain models based on those changes, or do so automatically based on policy.

These are just a few examples of AI-specific data governance concerns. To learn more, join us for this free 1-hour webinar from GigaOm Research. The webinar features GigaOm analyst Andrew Brust and special guest Will Nowak from Dataiku, an enterprise AI and machine learning platform.

In this 1-hour webinar, you will discover:

  • Governance “checkpoints” for data scientists
  • The relationship between data regulations, ethics, and AI
  • Making ML models compliant, both with government regulations and corporate policy

Register now to join GigaOm Research and Dataiku for this free expert webinar.

Who Should Attend:

  • CIOs
  • CTOs
  • Chief Data Officers
  • Data Stewards
  • Data Scientists
  • Machine Learning Engineers
  • Data Engineers
  • Business Analysts

Originally Posted at: Data Governance and AI: New Dimensions in Privacy and Compliance

Aug 20, 20: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Data interpretation  Source

[ AnalyticsWeek BYTES]

>> 6 Cool Companies Who Are Rethinking How We Work by analyticsweek

>> Processing Data like a Pro(fessional Data Scientist) by michael-li

>> Analytics Hall of Fame: Voices of the Famous Interview Series: What constitutes an Analytics Leader? The importance of Business Problem Framing. by tony

Wanna write? Click Here

[ FEATURED COURSE]

Learning from data: Machine learning course

image

This is an introductory course in machine learning (ML) that covers the basic theory, algorithms, and applications. ML is a key technology in Big Data, and in many financial, medical, commercial, and scientific applicati… more

[ FEATURED READ]

Python for Data Analysis: Data Wrangling with Pandas, NumPy, and IPython

image

Python for Data Analysis is concerned with the nuts and bolts of manipulating, processing, cleaning, and crunching data in Python. It is also a practical, modern introduction to scientific computing in Python, tailored f… more

[ TIPS & TRICKS OF THE WEEK]

Keeping Biases Checked during the last mile of decision making
Today a data-driven leader, data scientist, or data-driven expert is constantly put to the test, helping his or her team solve problems using skill and expertise. Believe it or not, part of that decision tree is derived from intuition, which adds a bias to our judgement and taints the suggestions. Most skilled professionals understand and handle these biases well, but in a few cases we give in to tiny traps and can find ourselves caught in biases that impair our judgement. So it is important to keep intuition bias in check when working on a data problem.

[ DATA SCIENCE Q&A]

Q: Explain selection bias (with regard to a dataset, not variable selection). Why is it important? How can data management procedures such as missing data handling make it worse?
A: * Selection of individuals, groups, or data for analysis in such a way that proper randomization is not achieved
Types:
– Sampling bias: systematic error due to a non-random sample of a population, causing some members to be less likely to be included than others
– Time interval: a trial may be terminated early at an extreme value (for ethical reasons), but the extreme value is likely to be reached by the variable with the largest variance, even if all the variables have similar means
– Data: “cherry picking”, when specific subsets of the data are chosen to support a conclusion (citing examples of plane crashes as evidence that airline flight is unsafe, while ignoring the far more common flights that complete safely)
– Studies: performing experiments and reporting only the most favorable results
– Selection bias can lead to inaccurate or even erroneous conclusions
– Statistical methods can generally not overcome it

Why can missing data handling make it worse?
– Example: individuals who know or suspect that they are HIV positive are less likely to participate in HIV surveys
– Missing data handling will amplify this effect, as it is based on the mostly HIV-negative respondents
– Prevalence estimates will be inaccurate

Source
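The HIV-survey example above is easy to simulate. The sketch below (the participation rates are illustrative, not from any real survey) shows how non-random participation alone drags a naive prevalence estimate well below the truth:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_prevalence = 0.10

# ground truth: 10% of the population is positive
positive = rng.random(n) < true_prevalence

# selection bias: positives are much less likely to take part in the survey
p_respond = np.where(positive, 0.3, 0.8)
responded = rng.random(n) < p_respond

# the naive estimate uses responders only -- and lands far below 10%
naive_estimate = positive[responded].mean()
```

The expected naive estimate here is 0.03 / (0.03 + 0.72) ≈ 4%, less than half the true prevalence. Imputing missing answers from the responders (who skew negative) would bake the same bias into the "completed" dataset.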

[ VIDEO OF THE WEEK]

Harsh Tiwari talks about fabric of data driven leader in Financial Sector #FutureOfData #Podcast


Subscribe to  Youtube

[ QUOTE OF THE WEEK]

Getting information off the Internet is like taking a drink from a firehose. – Mitchell Kapor

[ PODCAST OF THE WEEK]

Scott Harrison (@SRHarrisonJD) on leading the learning organization #JobsOfFuture #Podcast


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

94% of Hadoop users perform analytics on large volumes of data not possible before; 88% analyze data in greater detail; while 82% can now retain more of their data.

Sourced from: Analytics.CLUB #WEB Newsletter

[Step-by-step] Using Talend for cloud-to-cloud deployments and faster analytics in Snowflake

For the past two years, Snowflake and Talend have joined forces developing deep integration capabilities and high-performance connectors so that companies can easily move legacy on-premises data to a built-for-the-cloud data warehouse.

Snowflake, which runs on Amazon Web Services (AWS), is a modern data-warehouse-as-a-service built from the ground up for the cloud, for all an enterprise’s data, and all their users. In the first part of this two-part blog series, we discussed the use of Talend to bulk load data into Snowflake. We also showcased the ability to use Talend to perform ELT (Extract, Load, Transform) functions on Snowflake data, allowing you to take full advantage of Snowflake’s processing power and ease-of-use SQL querying interface to transform data in place.

This second video highlights Talend’s ability to harness the power of pure cloud deployments, making it possible to keep your data in place until it’s needed in Snowflake. By running Talend jobs directly in the cloud, no data is ever processed client-side, and no remote processing is required. As a result, the governance and restrictions that you have implemented to secure your data remain intact. In addition to the security benefits, you get the full computing performance of Snowflake. Once that data is required in Snowflake, it is moved or copied seamlessly and directly into Snowflake from your cloud provider location.

The video walks you through the entire process. From extracting data from an Amazon S3 bucket to moving that data into a Snowflake data warehouse using Talend Cloud. Talend Cloud provides an easy-to-use platform to process this data in the cloud, and then leverage the power and ease-of-use of Snowflake to access and analyze that data in the cloud.

Talend Cloud with Snowflake delivers cloud analytics 2 to 3 times faster, in a governed way.

<Start your free 30-day trial of Talend Cloud here.>

[youtube https://www.youtube.com/watch?v=qlZ-ptREgSw?start=150]

 

The post [Step-by-step] Using Talend for cloud-to-cloud deployments and faster analytics in Snowflake appeared first on Talend Real-Time Open Source Data Integration Software.

Source: [Step-by-step] Using Talend for cloud-to-cloud deployments and faster analytics in Snowflake

Social Engineers are No Match for Artificial Intelligence

We are still in the early days of artificial intelligence, but it is quickly becoming an essential part of how organizations defend themselves. Using advanced algorithms, enterprises are improving incident response, monitoring for potential threats and deciphering red flags before they take effect. It can also be used to help identify vulnerabilities that a human […]

The post Social Engineers are No Match for Artificial Intelligence appeared first on TechSpective.

Source: Social Engineers are No Match for Artificial Intelligence by administrator

Aug 13, 20: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Trust the data  Source

[ AnalyticsWeek BYTES]

>> May 24, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> Aug 08, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> What Does “Explainable AI” Mean for Your Business? by administrator

Wanna write? Click Here

[ FEATURED COURSE]

Applied Data Science: An Introduction

image

As the world’s data grow exponentially, organizations across all sectors, including government and not-for-profit, need to understand, manage and use big, complex data sets—known as big data…. more

[ FEATURED READ]

The Future of the Professions: How Technology Will Transform the Work of Human Experts

image

This book predicts the decline of today’s professions and describes the people and systems that will replace them. In an Internet society, according to Richard Susskind and Daniel Susskind, we will neither need nor want … more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better way than to talk about our increasing dependence on data analytics to help with our decision making. Data- and analytics-driven decision making is rapidly sneaking its way into our core corporate DNA, and we are not churning out practice grounds to test those models fast enough. Snug-looking models have hidden nails that can cause uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace to lab out best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q: What is: lift, KPI, robustness, model fitting, design of experiments, 80/20 rule?
A: Lift:
It's a measure of the performance of a targeting model (or a rule) at predicting or classifying cases as having an enhanced response (with respect to the population as a whole), measured against a random-choice targeting model. Lift is simply: target response / average response.

Suppose a population has an average response rate of 5% (to a mailing, for instance). If a certain model (or rule) has identified a segment with a response rate of 20%, then lift = 20/5 = 4.

Typically, the modeler seeks to divide the population into quantiles and rank the quantiles by lift. He can then consider each quantile and, by weighing the predicted response rate against the cost, decide whether to market to that quantile. For example: “if we use the probability scores on customers, we can get 60% of the total responders we’d get mailing randomly by only mailing the top 30% of the scored customers.”
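The arithmetic is simple enough to capture in a couple of lines (the function name is my own):

```python
def lift(target_response_rate, average_response_rate):
    """Lift = response rate of the targeted segment / population response rate."""
    return target_response_rate / average_response_rate

# the mailing example from the text: a 20% segment against a 5% baseline
assert lift(0.20, 0.05) == 4.0
```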

KPI:
– Key performance indicator
– A type of performance measurement
– Examples: 0 defects, 10/10 customer satisfaction
– Relies upon a good understanding of what is important to the organization

More examples:

Marketing & Sales:
– New customers acquisition
– Customer attrition
– Revenue (turnover) generated by segments of the customer population
– Often done with a data management platform

IT operations:
– Mean time between failure
– Mean time to repair

Robustness:
– Statistics with good performance even if the underlying distribution is not normal
– Statistics that are not affected by outliers
– A learning algorithm that can reduce the chance of fitting noise is called robust
– Median is a robust measure of central tendency, while mean is not
– Median absolute deviation is also more robust than the standard deviation
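The median-versus-mean claim above is easy to verify on a toy sample with one corrupted reading:

```python
import numpy as np

readings = np.array([9.0, 9.5, 10.0, 10.5, 11.0])
corrupted = np.append(readings, 1000.0)  # a single wild outlier

# the mean is dragged far from the bulk of the data by one bad value...
mean_before, mean_after = np.mean(readings), np.mean(corrupted)          # 10.0 -> 175.0
# ...while the median barely moves
median_before, median_after = np.median(readings), np.median(corrupted)  # 10.0 -> 10.25
```

One bad value moves the mean by more than an order of magnitude, while the median shifts by 0.25; the same contrast holds between the median absolute deviation and the standard deviation.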

Model fitting:
– How well a statistical model fits a set of observations
– Examples: AIC, R2, Kolmogorov-Smirnov test, Chi 2, deviance (glm)

Design of experiments:
The design of any task that aims to describe or explain the variation of information under conditions that are hypothesized to reflect the variation.
In its simplest form, an experiment aims at predicting the outcome by changing the preconditions, the predictors.
– Selection of the suitable predictors and outcomes
– Delivery of the experiment under statistically optimal conditions
– Randomization
– Blocking: an experiment may be conducted with the same equipment to avoid any unwanted variations in the input
– Replication: performing the same combination run more than once, in order to get an estimate for the amount of random error that could be part of the process
– Interaction: when an experiment has 3 or more variables, the situation in which the interaction of two variables on a third is not additive

80/20 rule:
– Pareto principle
– 80% of the effects come from 20% of the causes
– 80% of your sales come from 20% of your clients
– 80% of a company’s complaints come from 20% of its customers
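A quick, deterministic illustration of the rule on a toy revenue distribution (a Zipf-like heavy tail; the exact split depends on how skewed the data is):

```python
import numpy as np

# client k contributes revenue proportional to 1/k -- a heavy-tailed toy example
revenue = np.array([1.0 / k for k in range(1, 101)])

# share of total revenue held by the top 20% of clients
top_20_share = revenue[:20].sum() / revenue.sum()  # ~0.69 for this tail
```

Real sales data is often even more skewed, pushing the top-20% share toward (or past) the 80% of the classic rule.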

Source

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with  John Young, @Epsilonmktg


Subscribe to  Youtube

[ QUOTE OF THE WEEK]

Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom. – Clifford Stoll

[ PODCAST OF THE WEEK]

Discussing #InfoSec with @travturn, @hrbrmstr(@rapid7) @thebearconomist(@boozallen) @yaxa_io


Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

By then, our accumulated digital universe of data will grow from 4.4 zettabytes today to around 44 zettabytes, or 44 trillion gigabytes.

Sourced from: Analytics.CLUB #WEB Newsletter

Mobile Strategy For Brick-Mortar Stores And How Retailers Are Doing It All Wrong

No, we are not talking about the overall mobile strategy for retailers but a sub-part of it: mobile strategy for brick-mortar stores. Yes, it is different from the retailer's overall mobile strategy, and no, it cannot be done correctly without thinking differently from online store strategy. With the ever-increasing number of smartphones and affordable data plans, it would be a bad move not to think about mobile strategy for retailers. I hope retailers are already aware that it is critical for their business to have a mobile strategy; tons of material has already been written to suggest why. What I have yet to discover is a separate mobile strategy for retailers to help their brick-mortar stores.

For number enthusiasts:
According to comScore, as of January 2012, 101.3 million people in the United States have a smartphone. That's almost one in every three Americans! Nielsen puts this number at around 43%, and Fitch Ratings predicts that by the end of 2012, roughly two-thirds (66%) of the US population will be using smartphones. The U.S. Census Bureau recently announced that eCommerce was responsible for 48.2 billion dollars in sales during the third quarter of 2011.
Market research by Vibes shows that mobile technology plays a vital role for in-store shoppers:

  • 84% of shoppers have conducted in-store product research via smartphone
  • Nearly half of all consumers feel more confident about their purchasing decisions after pulling up additional product information on their mobile phones
  • 33% admitted to searching a competitor’s website for better deals while in-store
  • 6% of consumers said they were likely to abandon an in-store purchase for a competing offer

Interestingly, brick-mortar stores are not doing great despite many of them investing in a good mobile marketing strategy. The reason: the one-size-fits-all approach used by retailers. Many have just one app that handles the overall retail experience, the online presence as well as the brick-mortar stores. Retailers should refocus their mobile strategy and break the overall plan into two parts: an online mobile strategy and a brick-mortar store strategy. Each has its own focus: one primarily caters to online mobile surfers, while the other caters to visitors seeking help in brick-mortar stores. No, it is not necessary to build two different apps; it could very well be one integrated app, but the design and feature set should specifically keep in mind the needs of store visitors. When serving both from the same app, the application framework should be smart enough to identify the traffic. As a start, it would be a good idea to experiment and test using a QR-code-backed website, which could later be folded into the app strategy once the workflows and use cases are identified and validated.

The following design considerations would go a long way toward building a strong, brick-mortar-specific mobile strategy:

1. Include all possible use cases needed by store visitors and wanderers:
It is important to understand the most promising features required by users who are browsing the store or wandering nearby. Use cases may include learning more about products, asking for help, searching for accessories, price matching, etc. Hiring some shoppers or running focus groups could provide a starting ground. In this data-savvy age I am strictly against relying on focus groups alone, but they can certainly work as a starting point. It is also important to restrict the research and findings to areas that impact store workflows only.

2. Connect the online store with the offline store through a seamless layer:
Given the expanding mobile landscape, it is important not to lose sight of the bigger picture and the overall mobile strategy. Seamless connectivity between store-specific workflows and online store workflows gives users easy maneuverability. The goal should be to keep users satisfied by fulfilling all their needs, thereby keeping their business within the store's commerce. Examples include showing online product availability in-store and offering free home shipping from the online store. In short, each channel should compensate for the shortfalls of the other.

3. Provide the ability to leave feedback, suggestions, grievances, etc.:
Learning is an important part of any business, and with the evolution of data tools there is no excuse not to leverage them. Any possible auto-learning opportunities must be incorporated. A good customer experience management strategy provides the surveys and learning material that should simply be enabled in the mobile framework at the appropriate workflow touch points. Having done that, retailers will need little beyond this self-learning mechanism to evolve with changing market dynamics, which can lead to sustainable business growth.

4. Reward visitors for increased usage:
Increased usage provides many other opportunities for stores, such as better learning and better chances for referrals and recommendations. With that in mind, retailers should provision a reward system to encourage use of their mobile products, and a good design framework should provide the mechanics for integrating it. This could be done through store credits, coupons, etc. It is also important to learn which rewards work at which stage.

5. Provide seamless presence and connectivity with other social platforms:
It is no surprise that users already rely on better, more reliable local-presence social apps such as Facebook, Foursquare and Yelp. A store's mobile strategy should incorporate alliances with those frameworks as well, with a customized presence tailored to attract visitors. The sooner stores move along those lines, the more adoption they will receive.

So, get the right gear and move on to building a robust brick-mortar store mobile strategy that helps stores learn faster and move with changing customer landscapes.

Source by v1shal

Leadership Profiles: New CIOs Take the Reins in 12 States

While the first few months of 2020 were notable to be sure, they were perhaps especially daunting for the new permanent state chief information officers who stepped up to their posts amid the turmoil of the novel coronavirus. But the business of government IT pressed on, and while these CIOs were tasked with challenges like quickly transitioning to telework, shoring up cybersecurity and tracking COVID-19 data, the nuts and bolts of gov tech remained: modernizing legacy systems, expanding broadband and coordinating with state leadership on long-term priorities.

Our editorial staff got to know this new class of state CIOs, checking in as they settled into their positions at an unprecedented moment in history.

[slideshow-break]

Tracy Barnes, Indiana

Transparency is a cornerstone of the leadership style Tracy Barnes brings to his position as the head of Indiana’s Office of Technology (IOT). He prefers to engage fully with staff at all levels of the organization, sharing what he knows and relying upon their expertise to make the best possible decisions. But he started the job at the end of March, the same month the state saw its first case of COVID-19. That reality has made the kind of communication he seeks a little harder.

“I want to figure out how I make sure I get that message out there in a comprehensive manner to the full footprint of the agency so everyone understands the value that they’re bringing, especially in this time where tensions are high, pressure is high and expectations are high,” he said. “The ability to walk up and down the aisleways and just say ‘hi’ or ‘good job’ or ‘attaboy’ are limited and probably won’t be available at any true capacity for a while.”

Barnes brings a multi-faceted background to the CIO job, having worked extensively in the ERP area in private industry and higher education both in the United States and abroad. He made the move to government a few years ago as IT director for the state auditor’s office, then as chief of staff for the lieutenant governor. As state CIO, Barnes acknowledges the strong foundation built through the expertise of his predecessors and looks forward to bringing his enterprise-level skills to bear in order to move all agencies forward using technology.

A proponent of as-a-service technologies, Barnes envisions that establishing a sustainable, supportable multi-cloud offering will be a critical part of meeting the state’s needs. He wants to solidify IOT’s role in guiding agencies toward secure solutions that fit within the broader operational IT framework, especially in areas like data integration. Noting that technology investments tend to outlive the agency leadership that was in place at the time of the purchase, he takes a longer-term view of IOT’s responsibility.

“We need to make sure that folks still at the state can continue supporting them and managing them and maintaining them for potential turnover and for succession planning down the road,” he said.

— Noelle Knell

[slideshow-break]

Annette Dunn, Iowa

Annette Dunn is no stranger to the inner workings of government. She was named by Gov. Kim Reynolds to the CIO role in July of 2019, following a four-year stint as IT division director and CIO of the state Department of Transportation. And her DOT post came on the heels of nearly a decade in other roles with the state — notably among those as a key player in a statewide project to equip snowplows with GPS and advanced vehicle location technology that has since been used in a number of other states.

Taking on the state CIO job comes with similar challenges as roles she’s previously held, Dunn said, just on a bigger scale. And rising costs and flat or declining budgets place even more pressure on IT resources.

“We must provide the innovation and access to Iowans that they expect and need,” she said. “This puts a larger burden on the use of data and technology to help us make more strategic decisions and think outside of the box to be able to deliver services in more convenient and customer-friendly ways.”

Getting a handle on the state’s data is a major priority for Dunn. She’s eyeing a robust data warehouse that can be relied upon as a resource to users across the state in order to inform the best possible business decisions. And there’s work to be done to get there: getting a clear picture of the data held by various state systems; deduplication and standardization; and establishing access controls.  

“The creation of a strong, reliable data warehouse that is easily utilized will make us a stronger state and help us make better business decisions now and well into the future,” she added.

Her approach to leading the broader IT organization is to look both outward and inward. She sees vendor partners as playing a critical role in helping agencies meet their technology needs, as they can move more quickly and efficiently, often at a lower cost. But pivoting away from internal development is a big cultural shift, which explains why internal communication is another huge area of focus.

“From a leadership standpoint, it comes down to changing the culture and helping people see the big picture,” she explained. “I spend a lot of time convincing employees that change is just a different way of doing things, and at the end of the day there will always be work for them to do that is important and necessary.”

— Noelle Knell

[slideshow-break]


Jeff Wann, Missouri

Jeff Wann was on the job two weeks when the coronavirus upended life in Missouri, and across the country, all but shutting down the state.

“And everything changed,” Wann remarked on the last day of April, as the state counted more than 7,500 confirmed cases of COVID-19, and 329 fatalities related to the disease. Missouri Gov. Mike Parson declared a state of emergency on March 13.

Earlier this year, Wann was named Missouri’s new CIO, bringing a long career spanning the public, private and nonprofit sectors. He was tapped, in part, for his leadership in the job of modernizing Missouri’s Office of Administrative IT Services.

An overarching goal was “to help modernize the IT systems, and to help transform processes and procedures, and to help further mature the IT organization,” said Wann.

“The COVID-19 situation has helped us to accelerate that,” he said. “It’s a silver lining in a very dark cloud, because obviously, COVID-19 is a terrible thing for folks. But on the other hand, it’s been a catalyst to help us to be able to help our citizens. And frankly, help other states.”

The crisis required quick action in a number of areas. Tools like chatbots, which can take months to develop, were being turned out in only weeks and days. The state teamed up with the Missouri Hospital Association to launch a new tool, developed by Google, to form a marketplace that matches state suppliers of personal protective equipment with health-care workers. Telephone, GIS and other systems were upgraded and improved to meet the new challenges the crisis called for.

When Missouri does return to more normal operations, Wann plans to return to his punch list for modernizing IT.

“Now, it’s going to be tough,” he added. “Because projected revenue in fiscal year ’21 is not looking good for any state.”

“But I expect to keep going,” Wann continued. “I expect that now that we are training our people in these new technologies, we can continue on doing those things with the budgets as they are.”

— Skip Descant

[slideshow-break]

Brom Stibitz, Michigan

When he started as Michigan CIO on March 4, Brom Stibitz was prepared. A lifelong resident of the state, aside from a few years overseas after college, Stibitz had been chief deputy director of the Michigan Department of Technology, Management and Budget for five years. Before that he had been director of executive operations for the state Department of Treasury, a senior policy adviser, and a legislative director at the state House of Representatives. He was ready to hit the ground running.

Then the pandemic hit.

Like most state CIOs, Stibitz had to set aside what he thought he’d be doing this spring and instead manage organization-wide emergency measures, including telework on a scale that Michigan had never attempted before. He spent much of those first weeks overseeing preparations for nearly 28,000 state employees: doubling VPN firewall capacity, finding laptops and organizing staff training on various tools for working from home.

He defines his broader priorities for Michigan as efficient and effective government, IT accountability, customer experience, and (of course) cybersecurity. He said those weren’t explicit directives from Gov. Gretchen Whitmer, but they appeared to be shared goals among state departments.

“There’s been a lot of focus on, how do we make sure that services of the state are accessible to people, not just that it’s there and people can use it, but how do you make it so people can understand it and it’s truly accessible?” he said. “The other area of focus has been efficiency … How do we make sure that we’re consolidating around solutions instead of expanding the state’s footprint?”

Asked what recent IT projects he’s most glad to have done, Stibitz mentioned a couple that weren’t flashy, but critical: developing a single sign-on solution, now used by more than 200 applications, to simplify security; and transitioning state employees to Microsoft Office 365, which reduced their reliance on network storage, revealed several practical and budgetary efficiencies, and unwittingly prepared the state to work from home.

In April, Stibitz was fairly sanguine about the results of the state’s telework operation, but under no illusions about the economic challenges to come.

“We’re looking at precipitous declines in revenue over the next six, 12, 18 months,” he said. “So there’s going to be more pressure than ever on IT to (a) find efficiencies within what we’re doing, and (b) to find solutions that can help agencies or customers save money.”

— Andrew Westrope

[slideshow-break]

John Salazar, New Mexico

When John Salazar became New Mexico’s IT secretary on March 2, he knew he’d have to address issues such as dated infrastructure and broadband access.

What he didn’t know was that his new role would soon revolve around responding to a pandemic that would infect more than 1 million Americans within two months. The workload has been gigantic.

“We’re working weekends. We’re working nights. It’s been a challenge for us,” Salazar said.

The first big hurdle was ensuring that roughly 20,000 government employees could work from home. Salazar thinks the mission was accomplished, but not without hiccups. The state filed an emergency order with a vendor for 1,000 laptops, but the machines came a month late, so Salazar’s team had to get creative in an IT structure where agencies manage their own networks and workstations.

“The first couple of weeks were chaos,” Salazar recalled. “We were all working in different directions.”

He spent a lot of time in April collaborating with the New Mexico Department of Health, which has two legacy systems that receive COVID-19 testing data from the Centers for Disease Control and Prevention. The goal was to create a dashboard with relevant information for Gov. Michelle Lujan Grisham, which required Salazar’s team to, among other steps, stand up a platform in the cloud and develop data interfaces between the legacy systems.

Having previously worked as a CIO in two different state agencies — Taxation and Revenue, and the Department of Workforce Solutions — Salazar was well prepared for his position as head of New Mexico IT. But no one could foresee the long-term impact of COVID-19. Salazar attempted to compare the situation to Y2K, but he pointed out that at least with Y2K, there was a “long climbing process” during which people knew what was potentially coming.

Now leaders like Salazar must react in ways that might forever change how states utilize technology. He sees plenty of opportunities to improve New Mexico’s systems by incorporating more cloud solutions, automating more processes and putting in place more procedures for better cybersecurity.

As for government meetings, New Mexico is holding a tremendous number of virtual sessions — a new trend that perhaps should become the norm.

“All of our employees are doing this on a regular basis, and that’s something that probably needs to continue in the future,” Salazar said.

— Jed Pressgrove

[slideshow-break]

Tracy Doaks, North Carolina

Tracy Doaks is a self-proclaimed technologist at heart. “I love talking about it, translating that in business terms so that I can talk about it with different audiences, understanding the finance side of it,” she said.

That was essential in her last four years as deputy CIO in North Carolina, where she primarily focused on back-of-house operations like data centers, the state network, and cloud and identity management. Since she took the head post as CIO in March, Doaks has had to pivot to more outward-facing work, coordinating with the governor’s office and doing more public speaking. “Now my focus has expanded to all facets of the Department of Information Technology [DIT],” she explained, “so that includes cybersecurity, data analytics, rural broadband, 911, digital transformation.”

Doaks has spent 20 years in and out of government, in the North Carolina Department of Revenue, where she was CIO, as well as time in health care and with Accenture. That experience meant that when she became state CIO, just as COVID-19 was gaining ground in the U.S., she had a firm grasp on how government IT works and the role it would play in adapting to the pandemic. DIT quickly stood up a crisis response team that met twice daily, “so that we could understand obstacles and challenges around the state and knock them down quickly.”

Some of those challenges included putting together a coronavirus website in just a few days to help pull heavy Internet traffic away from Health and Human Services, as well as supporting similar issues at the Division of Employment Security.

And the pandemic of course heightened the need to expand broadband connectivity, particularly throughout the state’s rural areas, which Doaks had to quickly get up to speed on. “As the schoolchildren and college kids were sent home and employees were sent home,” she said, “that made it even more critical and a top priority for us.”

— Lauren Harrison

Editor’s note: After the July/August issue of Government Technology went to press, Doaks stepped down as CIO to helm a tech-related nonprofit in North Carolina.

[slideshow-break]

Jerry Moore, Oklahoma

Jerry Moore took the reins as Oklahoma’s new CIO in February, at a time when the state’s IT organization and direction was shifting.

Gov. Kevin Stitt, who appointed Moore, has made it known he is pursuing a new direction for Oklahoma IT, prioritizing digital transformation and modernization as two of the main efforts of his administration.

Part of this has involved a reorganization of the state’s Office of Management and Enterprise Services: OMES is in the process of conducting an audit meant to identify unnecessary expenditures, which was ongoing when Moore came on the scene.

Having spent a decade as the CIO for the Tulsa Technology Center — the educational IT hub affiliated with the Oklahoma Department of Career and Technology Education — Moore is no outsider to government work. Before becoming CIO, he also worked as the state’s director of IT application services.

At the same time, it is his private-sector experience that has likely given Moore the skill set that is most appealing in light of the governor’s effort: Having held IT leadership roles for large companies like ConocoPhillips and SiteTraxx, Moore has consistently shown an ability to take on IT restructuring projects that hew to long-term strategic goals.

Stitt has said this is what he hopes Moore will bring to the job: an ability to deliver high-performance, cost-effective solutions as the state navigates its modernization efforts.

“Jerry’s more than 20 years of experience in technology leadership in the public and private sectors will serve Oklahomans well as we continue our efforts in becoming a top ten state,” said OMES Director Stephen Harpe in a statement. “He has a proven record in identifying and executing new technologies to solve business problems.”

— Lucas Ropek

[slideshow-break]

Jeffrey Clines, South Dakota

A drive to help people, rather than increase margins and decrease bottom lines, is what brought Jeffrey Clines to public-sector work. He began his career in the private sector, then spent more than a decade in operations and enterprise applications for the American Heart Association. But even that nonprofit work didn’t fully get at Clines’ desire to impact real people’s lives. In 2018 he moved to government, as CIO for the Illinois Secretary of State’s office, before becoming head of the South Dakota Bureau of Information and Telecommunications this past April.

“I believe that in technology, you can’t stay in one place,” Clines wrote in an email to Government Technology. “We push forward, finding ways to leverage technology — especially emerging tech — to improve service, systems and processes.” He’s committed to working with state agencies to target where they want to go and see how tech will help get them there.

“There is no stable ground anymore,” he said. “The days of being able to stand up a system and have it work for 20-plus years are no longer here.”

That forward-looking approach to state IT has so far served him well during his tenure, which of course began amid the COVID-19 pandemic as South Dakota moved to nearly all remote work. When Clines thinks about when the U.S. will “return to normal,” he hopes it doesn’t. “If we push to go back to where we were, we may lose some of the valuable lessons we have learned in this process — things like the ability of our teams to work independently and remotely, or how we have really pushed to look at ways to be creative in delivering services while maintaining social distancing.”

One area where Clines sees this having the most impact is on rural communities as people are given the option to work away from city centers and be just as productive. South Dakota is home to many scenic, far-flung areas that Clines hopes to reinvigorate via telework, which he notes will need investment in broadband and other critical infrastructure to thrive.

— Lauren Harrison

[slideshow-break]

Bill Smith, Alaska

Bill Smith took up the role of Alaska’s state CIO in late 2019, a turbulent time for the state IT department. He became the fifth person to hold the job since 2018, after a series of interim leaders.

Why the turnover? The state had launched an effort to centralize its technology offerings, and the process “didn’t go as well as everybody wanted,” Smith said. The job nonetheless appealed to him because, as he saw it, the fundamentals had begun to fall into place.

“The state leadership is very supportive, from the governor to all the cabinet-level department leaders. We’ve also brought in external resources, where before the centralization effort was being done entirely in house,” he said. “I believe the conditions exist now for us to have long-term success with this effort.”

Those external resources included a third-party assessment of the overall IT ecosystem, which Smith is now leveraging as the basis for prioritization. He also is busy updating all the technology-position descriptions to align with a modernized IT environment, and he’s taking a fresh look at the service catalog.

“We want to build out the organizational chart more fully, to nail down what the services are, how we will provide those and with what resources,” he said. This will include an overhaul of IT governance, with an eye toward creating a more collaborative relationship between the Office of Information Technology and state agencies.

“I’d also like to be able to be the broker,” he said. “If the departments have an IT need, my team can figure out what services are available, so that the departments can focus on meeting their own business needs.”

— Adam Stone

[slideshow-break]

J.R. Sloan, Arizona

J.R. Sloan joined Arizona state government in 2013 as manager of the digital government program. He moved to the deputy CIO position, and then became interim CIO in July 2019. When he officially assumed the CIO role in March 2020, he set cybersecurity as a top goal.

“In Arizona we have a federated model, where agencies have a large degree of autonomy and independence,” he said. To counter the cyber-risk inherent in that model, Sloan has been implementing an enterprise approach to security, standardizing agencies on a set of 16 different security controls.

While legislative funding has helped drive the change, “we really needed to get the agencies involved in the process,” he said. “We set up a committee with security officers from each of the agencies, all bringing front-line knowledge that we can use to guide the process.”

Looking forward, he said the IT department will try to expand that cooperative mentality across a broader range of IT needs. “We can bring enterprise services to the agencies and save them having to execute on those tasks themselves, so that they can redirect those resources to actually work on the mission of the agency.”

To that end, the state already was engaged in a move to Google’s G Suite for mail and calendar, with 40,000 employees across 80 different agencies onboarded in two years, Sloan said. Now he is looking to expand that implementation to include things like document editors and chat features.

“All this stuff is included in the suite that we’ve already paid for,” he said. Beyond the financial pitch, he’s been wooing agencies on the basis of functionality. “Within the suite, everything is connected and it all works well together. When you can have the same document up with someone else and you can collaborate in real time — that’s when we really catch people’s attention.”

— Adam Stone

[slideshow-break]

DeAngela Burns-Wallace, Kansas

DeAngela Burns-Wallace was already heading up the cabinet-level Department of Administration in Kansas when the governor tapped her to take on an added role as leader of the independent Office of Information Technology Services (OITS).

In her new position, which she’s held since August 2019, Burns-Wallace has staked out a number of key goals. She’s looking to modernize legacy systems and putting a heavy emphasis on security. “Our security posture is solid, but I want us to not just be ‘in the moment,’” she said. “I want us to be in a more strategic stance, strengthening our overall security posture not just as individual agencies but in a coordinated way across state government.”

Data governance also is high on her agenda. “We have some data sharing that has come together out of necessity, but now we need to take a step back and put a real strategy and structure around that,” she said. “We need to put in place sustainable guidelines and policies that aren’t susceptible to changing leadership or changing political winds.”

Finally, Burns-Wallace said she is looking to elevate the perception of IT as a dependable partner across all levels of state government. “We have to have reliable, consistent high-quality IT services across all of our agencies. But over the years, that consistency and level of quality have been uneven,” she said. “Non-cabinet agencies for instance have not always gotten the same level of service, and yet their work is incredibly important. They have a significant impact for our state.”

Going forward, Burns-Wallace said OITS needs to establish a more level playing field, in order to change perceptions at the agency level. “There needs to be a reliably high level of service across all those entities,” she said. “That’s how we rebuild trust and confidence in what IT is delivering.”

— Adam Stone

[slideshow-break]

Ruth Day, Kentucky

Ruth Day took up the helm at Kentucky’s Commonwealth Office of Technology (COT) in December 2019 amid a flurry of bad news.

Two months earlier, a state auditor faulted COT’s inventory practice, saying the agency couldn’t account for some $715,000 worth of equipment. Then in November, a contract worker with access to COT’s storage rooms was indicted for allegedly stealing more than $1 million in laptops from the agency.

Despite the potentially fragile environment, Day rose to the challenge of helping state agencies respond to the COVID-19 outbreak just a few months after taking on her new role. In mid-March she issued a memo to guide officials in their use of online meeting platforms. When vulnerabilities appeared in the popular Zoom platform, she quickly followed up with further guidance.

Meanwhile, Day continues to lead COT in its efforts to address a number of key issues.  Supported by COT, state offices have begun to connect to KentuckyWired, a state-run project constructing high-speed fiber-optic infrastructure to every Kentucky county. Looking ahead, COT will be supporting the National Guard in mounting Cyber Protection Teams to secure the upcoming primary and general elections.

Prior to her appointment, Day served as the vice president for administrative services at Landstar System Inc., a transportation services company specializing in logistics. In a press conference at the time of her appointment, Day expressed enthusiasm for the work ahead. “I’m honored to join the [Gov. Andy] Beshear-[Lt. Gov. Jacqueline] Coleman administration and I think you can tell that the [governor] has laid out a very clear and concise mission for me …,” she said. “I’m very excited and ready to go to work for Kentucky.” 

— Adam Stone 

Source

Aug 06, 20: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Convincing  Source

[ AnalyticsWeek BYTES]

>> What Is Customer Delight? by analyticsweek

>> Fast and Powerful Custom Reporting is the Focus of our Redesigned Analytics Engine by analyticsweek

>> Birth of Nike’s “Just Do It” involved a firing squad, a murderer [ Video ] by v1shal

Wanna write? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning

image

This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

How to Create a Mind: The Secret of Human Thought Revealed

image

Ray Kurzweil is arguably today’s most influential—and often controversial—futurist. In How to Create a Mind, Kurzweil presents a provocative exploration of the most important project in human-machine civilization—reverse… more

[ TIPS & TRICKS OF THE WEEK]

Grow at the speed of collaboration
Research by Cornerstone OnDemand points to the need for better collaboration within the workforce, and the data analytics domain is no different. A rapidly changing, fast-growing field like data analytics is hard to keep up with in an isolated workforce. A good collaborative work environment facilitates a better flow of ideas, improved team dynamics, rapid learning, and a greater ability to cut through the noise. So, embrace collaborative team dynamics.

[ DATA SCIENCE Q&A]

Q: What is root cause analysis? How do you identify a cause vs. a correlation? Give examples.
A: Root cause analysis:
– A method of problem solving used to identify the root causes or faults of a problem
– A factor is considered a root cause if removing it prevents the final undesirable event from recurring

Identifying a cause vs. a correlation:
– Correlation: a statistical measure that describes the size and direction of a relationship between two or more variables. A correlation between two variables does not imply that a change in one variable causes the change in the values of the other
– Causation: indicates that one event is the result of the occurrence of the other event; there is a causal relationship between the two events
– Correlations are easy to measure, but establishing cause and effect is difficult

Example: sleeping with one’s shoes on is strongly correlated with waking up with a headache. The correlation-implies-causation fallacy would conclude that sleeping with one’s shoes on causes headaches.
A more plausible explanation: both are caused by a third factor: going to bed drunk.

Identifying a cause vs. a correlation: use a controlled study
– In medical research, one group may receive a placebo (control) while the other receives a treatment. If the two groups have noticeably different outcomes, the different treatments may have caused the different outcomes
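The shoes-and-headache example above can be sketched numerically. The simulation below is illustrative only (all probabilities are made up, not from the newsletter): a confounder (going to bed drunk) drives both "shoes on" and "headache", producing a strong correlation between two variables that do not cause each other. Conditioning on the confounder, as a controlled study would, makes the correlation vanish.

```python
import random
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(42)
n = 10_000

# Confounder: went to bed drunk (1) or sober (0).
drunk = [1 if random.random() < 0.3 else 0 for _ in range(n)]

# Both outcomes depend only on the confounder, never on each other.
shoes    = [1 if random.random() < (0.7 if d else 0.05) else 0 for d in drunk]
headache = [1 if random.random() < (0.8 if d else 0.10) else 0 for d in drunk]

# Naive analysis: shoes and headaches look strongly related.
overall = pearson(shoes, headache)

# "Controlled" analysis: hold the confounder fixed (sober group only);
# the apparent relationship disappears.
sober = [i for i, d in enumerate(drunk) if d == 0]
within = pearson([shoes[i] for i in sober], [headache[i] for i in sober])

print(f"overall r = {overall:.2f}, within sober group r = {within:.2f}")
```

Running this shows a sizeable overall correlation but a near-zero correlation once the confounder is held fixed, which is exactly what distinguishes a common-cause correlation from causation.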

Source

[ VIDEO OF THE WEEK]

@DrewConway on fabric of an IOT Startup #FutureOfData #Podcast

Subscribe on YouTube

[ QUOTE OF THE WEEK]

He uses statistics as a drunken man uses lamp posts—for support rather than for illumination. – Andrew Lang

[ PODCAST OF THE WEEK]

@TimothyChou on World of #IOT & Its #Future Part 2 #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Data is growing faster than ever before, and by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet.

Sourced from: Analytics.CLUB #WEB Newsletter