Assess Your Analytics UX in 3 Questions

An application can live or die by its embedded analytics. It doesn’t matter if the rest of your product is perfectly designed. If your dashboards and reports have a disappointing user experience (UX), user adoption and customer satisfaction can plummet.

“User experience matters,” writes Gartner in their recent report, 5 Best Practices for Choosing an Embedded Analytics Platform Provider. “Embedded analytics [should] not only support embedding of charts and visualizations, but also go deeper and integrate the data and analytics into the fabric of the application. This ‘seamless’ approach means that users don’t even know they are using a multiproduct application.”

>> Related: UX Design for Embedded Analytics <<

How solid is your analytics UX? Ask yourself these three questions to gauge where and how you can improve your analytics experience:

#1. Do you have a deep understanding of your users?

A lack of understanding about what users need from their dashboards and reports is a challenge that plagues product teams. Many companies fill their analytics with data they think their users want, and never do the due diligence to find out what users actually need. Don’t assume what your business intelligence users want. Take the time to research how users will interact with your application, so you can build it with them in mind. It’s a seemingly obvious but often-missed point: Different end users want to use your application’s embedded analytics in different ways.

#2. Does your embedding stop at the visualizations?

Embedded analytics involves more than white-labeling some charts and graphs. Application teams need to look at the complete experience—not just the visuals—to ensure end users can’t tell where your application ends and the embedded analytics begins. A truly seamless experience “allows users to take immediate action from within the application, without shifting context,” notes Gartner in their report. Ideally, you want to integrate the analytics into your users’ workflows by letting them take action from the analytics, write back to the database, and share insights in context.

#3. Do the visualizations match your data?

Another common problem is choosing the wrong data visualizations to illustrate your datasets. Most visualizations are good for some types of data, but not every type. For example, a scatter chart works well to display two variables from a dataset, but it’s only useful when there is a number value on each axis; without that, it will appear to be a line chart without the line. Or consider the common pie chart, which is great for four or five values—but completely breaks down when sliced into dozens of sections. These are just two examples of how poor UI/UX can make information difficult for a user to understand.

If you’ve answered “yes” to any of the questions above, it’s time to update your analytics before your customers start abandoning your product for the competition. Learn how to take the next steps in our Blueprint to Modern Analytics guide.

Originally Posted at: Assess Your Analytics UX in 3 Questions by analyticsweek

Visualization’s Twisted Path

Visualization is not a straight path from vision to reality. It is full of twists and turns, rabbit trails and road blocks, foul-ups and failures. Initial hypotheses are often wrong, and promising paths are frequently dead ends. Iteration is essential. And sometimes you need to change your goals in order to reach them.

We are as skilled at pursuing the wrong hypotheses as anyone. Let us show you.

We had seen the Hierarchical Edge Bundling implemented by Mike Bostock in D3. It really clarified patterns that were almost completely obfuscated when straight lines were used. 

Edge Bundling

We were curious if it might do the same thing with geographic patterns. Turns out Danny Holten, creator of the algorithm, had already done something similar. But we needed to see it with our own data.

We grabbed some state-to-state migration data from the US Census Bureau, then found Corneliu Sugar’s code for force-directed edge bundling and got to work.

To start, we simply put a single year’s (2014) migration data on the map. Our first impression: sorrow, dejection and misery. It looked better than a mess of straight lines, but not much better. Chin up, though. This didn’t yet account for how many people were flowing between each of the connections — only whether there was a connection or not. 

Unweighted edge bundled migration

With edge bundling, each path between two points can be thought to have some gravity pulling other paths toward it while itself being pulled by those other paths. In the first iteration, every part of a path has the same gravity. By changing the code to weight the bundling, we add extra gravity to the paths more people move along.
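The weighting idea described above can be sketched in a few lines. This is a toy stand-in, not the actual force-directed edge bundling implementation: the function name and the simplified "pull toward the weighted mean" force model are assumptions for illustration only.

```python
# Toy sketch of weighted bundling: each point on a path is pulled toward
# the corresponding points on other paths, with the pull scaled by each
# path's migration volume (its "gravity"). A crude stand-in for the
# pairwise spring/electrostatic forces of the real algorithm.
def bundle_step(paths, weights, k=0.1):
    """One iteration: nudge every point toward the volume-weighted mean
    of the corresponding points across all paths."""
    total_w = sum(weights)
    new_paths = []
    for path in paths:
        moved = []
        for j, (x, y) in enumerate(path):
            # volume-weighted average position of point j over all paths
            ax = sum(w * p[j][0] for p, w in zip(paths, weights)) / total_w
            ay = sum(w * p[j][1] for p, w in zip(paths, weights)) / total_w
            moved.append((x + k * (ax - x), y + k * (ay - y)))
        new_paths.append(moved)
    return new_paths
```

Under this model, heavy flows barely move while light ones are drawn toward them — the effect the weighted map is after.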

Weighted edge bundled migration

Alas, things didn’t change much. And processing was taking a long time with all those flows. When the going gets tough, simplify. We cut the data into two halves, comparing westward flows to eastward flows.

East to west migration

West to east migration

Less data meant cleaner maps. We assumed there would be some obvious difference between these two, but these maps could be twins. We actually had to flip back and forth between them to see that there was indeed a difference.

So our dreams of mindblowing insight on a migration data set using edge bundling were a bust. But seeing one visualization regularly leads to ideas about another. We wondered: what would happen if we animated the lines from source to destination? For simplicity, we started with just eastward migration.

Lasers

Cool, it’s like laser light leisurely streaming through invisible fibre optic cables. But there’s a problem. Longer flows appear to indicate higher volume (which is misleading as their length is not actually encoding volume, just distance). So we tried using differential line lengths to represent the number of people, sticking with just eastward flows. 

Star Wars blasters

Here we get a better sense of the bigger sources, especially at the beginning of the animation. However, for some paths, like California to Nevada, we end up with a solid line for most of the loop. The short geographic distance obscures the large migration of people. We wondered if using dashed lines would fix this.

Machine gun bursts

This gives us a machine gun burst at the beginning with everything draining into 50 little holes at the end. We get that sense of motion for geographically close states, but the visual doesn’t match our mental model of migration. Migrants don’t line up in a queue at the beginning of the year, leaving and arriving at the same time. Their migration is spread over the year.

What if instead we turned the migration numbers into a rate of flow? We could move dots along our edge-bundled paths, have each dot represent 1,000 people, and watch as they migrate. The density of the dots along a path then represents the volume. This also has the convenience of being much simpler to explain.
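The dot encoding can be made concrete with a small sketch. The 1,000-people-per-dot constant mirrors the text; the function names and the even spacing are illustrative assumptions.

```python
# Sketch: turn an annual migration count into a steady stream of dots.
# A dot's position along its path is a phase in [0, 1) that advances
# over the animation loop; dot density along a path encodes volume.
def dots_for_flow(migrants, people_per_dot=1000):
    """Number of dots used to represent one state-to-state flow."""
    return max(1, round(migrants / people_per_dot))

def dot_phases(n_dots):
    """Space dots evenly along the path so density reflects volume."""
    return [i / n_dots for i in range(n_dots)]
```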

Radar signals

We still have a burst of activity (like radar signals) at the beginning of the loop, so we’ll stagger the start times to remove this pulsing effect.

Staggered starts
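The staggering step amounts to a random phase offset per dot, wrapped around the animation loop. A minimal sketch, where the function name and the use of a plain uniform offset are assumptions:

```python
import random

# Remove the start-of-loop pulse by giving every dot a random start
# offset, so departures and arrivals spread across the whole loop.
def staggered_phase(base_phase, rng=random.random):
    """Offset a dot's phase by a random amount, wrapping at 1.0."""
    return (base_phase + rng()) % 1.0
```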

Voilà. This finally gives us a visual that matches our mental model: people moving over the period from one state to another. Let’s add back westward movement.

Ants

Very cool, but with so much movement it’s difficult to tell who’s coming and who’s going. We added a gradient to the paths to make dots appear blue as they leave a state and orange as they arrive.

Coloured ants

Let’s be honest, this looks like a moderately organized swarm of ants. But it is a captivating swarm that people can identify with. Does it give us any insight? Well, not of the sort we were originally working toward. There’s no simple way to compare years, no clear statements about the inflows and outflows. If we want to make sense of the data and draw specific conclusions… well, other tools might be more effective.

But it is an enchanting overview of migration. It shows the continuous and overwhelming amount of movement across the country and highlights some of the higher volume flows in either direction. It draws you in and provides you with a perspective not readily available in a set of bar charts. So we made an interactive with both.

Each dot represents 1,000 people and the year’s migration happens in 10 seconds. Or if you’d prefer, each dot can represent 1 person, and you can watch the year play out in just over 2 hours and 45 minutes. If you’re on a desktop you can interact with it to view a single state’s flow. And of course for mobile and social media, we made the obligatory animated gif.
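The two playback speeds quoted above are consistent with each other: scaling from 1,000 people per dot down to 1 person per dot multiplies the loop time by 1,000.

```python
# With one dot per 1,000 people the year plays in 10 seconds, so one
# dot per person takes 1,000 times as long.
seconds_fast = 10
seconds_slow = seconds_fast * 1000        # 10,000 seconds
hours, rem = divmod(seconds_slow, 3600)
minutes, seconds = divmod(rem, 60)
# 10,000 s = 2 h 46 min 40 s -- "just over 2 hours and 45 minutes"
```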

And just when we thought we’d finished, new data was released and we were obliged to update things for 2015.

Glowing ants

Building a visualization that is both clear and engaging is hard work. Indeed, sometimes it doesn’t work at all. In this post we’ve only highlighted a fraction of the steps we took.  We also fiddled with algorithm settings, color, transparency and interactivity.  We tested out versions with net migration. We tried overlaying choropleths and comparing the migration to other variables like unemployment and birth rate. None of these iterations even made the cut for this blog post.

An intuitive, engaging, and insightful visualization is rare precisely because of how much effort it takes. We continue to believe that the effort is worthwhile.

Originally Posted at: Visualization’s Twisted Path

Feb 21, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Accuracy check


[ AnalyticsWeek BYTES]

>> Announcing dplyrXdf 1.0 by analyticsweekpick

>> Data Lakes: The Enduring Case for Centralization by jelaniharper

>> Social Media and the Future of Customer Support [Infographics] by v1shal


[ NEWS BYTES]

>> Platform Virtualization Software Market Analysis, Share, Trends and Forecast 2025 – Industry Research Report 2018 – Journal of Industry Under Virtualization

>> AWS Says It’s Never Seen a Whole Data Center Go Down – Data Center Knowledge Under Data Center

>> CSP (CSPI) and Verint Systems (VRNT) Head-To-Head Survey – Fairfield Current Under Social Analytics


[ FEATURED COURSE]

Learning from data: Machine learning course


This is an introductory course in machine learning (ML) that covers the basic theory, algorithms, and applications. ML is a key technology in Big Data, and in many financial, medical, commercial, and scientific applicati… more

[ FEATURED READ]

Thinking, Fast and Slow


Drawing on decades of research in psychology that resulted in a Nobel Prize in Economic Sciences, Daniel Kahneman takes readers on an exploration of what influences thought example by example, sometimes with unlikely wor… more

[ TIPS & TRICKS OF THE WEEK]

Fix the Culture, spread awareness to get awareness
Adoption of analytics tools and capabilities has not yet caught up to industry standards. Talent has always been the bottleneck towards achieving the comparative enterprise adoption. One of the primal reason is lack of understanding and knowledge within the stakeholders. To facilitate wider adoption, data analytics leaders, users, and community members needs to step up to create awareness within the organization. An aware organization goes a long way in helping get quick buy-ins and better funding which ultimately leads to faster adoption. So be the voice that you want to hear from leadership.

[ DATA SCIENCE Q&A]

Q:How to detect individual paid accounts shared by multiple users?
A: * Check geographical region: Friday morning a log in from Paris and Friday evening a log in from Tokyo
* Bandwidth consumption: if a user goes over some high limit
* Counter of live sessions: if they have 100 sessions per day (4 times per hour) that seems more than one person can do
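The heuristics in the answer above can be combined into a simple rule-based flagger. Field names and thresholds here are illustrative assumptions, not a production fraud model:

```python
# Flag accounts whose usage exceeds what one person can plausibly do:
# too many live sessions, logins from too many regions in a day, or
# unusually heavy bandwidth consumption.
def flags(account):
    f = []
    if account["sessions_per_day"] > 100:        # ~4 sessions per hour
        f.append("session-count")
    if account["distinct_regions_per_day"] > 2:  # e.g. Paris AM, Tokyo PM
        f.append("geography")
    if account["bandwidth_gb"] > 500:            # over a high usage limit
        f.append("bandwidth")
    return f
```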


[ VIDEO OF THE WEEK]

Surviving Internet of Things


[ QUOTE OF THE WEEK]

In God we trust. All others must bring data. – W. Edwards Deming

[ PODCAST OF THE WEEK]

#FutureOfData Podcast: Peter Morgan, CEO, Deep Learning Partnership


[ FACT OF THE WEEK]

14.9 percent of marketers polled in Crain’s BtoB Magazine are still wondering ‘What is Big Data?’

Sourced from: Analytics.CLUB #WEB Newsletter

Data Management Rules for Analytics

With analytics taking a central role in most companies’ daily operations, managing the massive data streams organizations create is more important than ever. Effective business intelligence is the product of data that is scrubbed, properly stored, and easy to find. When your organization uses raw data without proper management procedures, your results suffer.

The first step towards creating better data for analytics starts with managing data the right way. Establishing clear protocols and following them can help streamline the analytics process, offer better insights, and simplify the process of handling data. You can start by implementing these five rules to manage your data more efficiently.

1. Establish Clear Analytics Goals Before Getting Started

As the amount of data produced by organizations daily grows exponentially, sorting through terabytes of information can become problematic and reduce the efficiency of analytics. Such large data sets require significantly longer times to scrub and properly organize. For companies that deal with multiple high-bandwidth data streams, having a clear line of sight toward business and analytics goals can help reduce inflows and prioritize relevant data.

It’s important to establish clear objectives for data and create parameters that filter out data points that are irrelevant or unclear. This facilitates pre-screening datasets and makes scrubbing and sorting easier by reducing white noise. Additionally, you can focus even more on measuring specific KPIs to further filter out the right data from the stream.
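As a sketch of that kind of pre-screening, a filter might keep only records tied to the KPIs under analysis and drop unclear points before they reach the warehouse. The event names and record fields below are hypothetical:

```python
# Pre-screen raw records: keep only events that map to an analytics
# goal, and drop points with missing values ("white noise").
RELEVANT_EVENTS = {"purchase", "signup", "churn"}

def prescreen(records):
    return [
        r for r in records
        if r.get("event") in RELEVANT_EVENTS   # matches a defined KPI
        and r.get("value") is not None         # filter unclear data points
    ]
```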

6 crucial steps of preparing data for analysis

2. Simplify and Centralize Your Data Streams

Another problem analytics suites face is reconciling disparate data from multiple streams. Organizations have internal, third-party, customer, and other data that must be considered as part of a larger whole instead of viewed in isolation. Leaving data as-is can be damaging to insights, as different sources may use unique formats or different styles.

Before allowing multiple streams to connect to your data analytics software, your first step should be establishing a process to collect data more centrally and unify it. This centralization makes it easier to input data seamlessly into analytics tools, but also simplifies the methodology for users to find and manipulate data. Consider how to set up your data streams best to reduce the number of sources to eventually produce more unified sets.

3. Scrub Your Data Before Warehousing

The endless stream of data raises questions about quality and quantity. While having more information is preferable, data loses its usefulness when it’s surrounded by noise and irrelevant points. Unscrubbed data sets make it harder to uncover insights, properly manage databases, and access information later.

Before worrying about data warehousing and access, consider the processes in place to scrub data to produce clean sets. Create phases that ensure data relevance is considered while effectively filtering out data that is not pertinent. Additionally, make sure the process is as automated as possible to reduce wasted resources. Implementing functions such as data classification and pre-sorting can help expedite the cleaning process.
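A minimal scrubbing pass, assuming customer records keyed by email, might normalize, filter, and deduplicate in one automated sweep. All field names here are illustrative:

```python
# Scrub raw records before warehousing: normalize values, filter out
# malformed entries, and drop duplicates.
def scrub(records):
    seen, clean = set(), []
    for r in records:
        email = r.get("email", "").strip().lower()  # normalize
        if not email or "@" not in email:           # filter irrelevant
            continue
        if email in seen:                           # deduplicate
            continue
        seen.add(email)
        clean.append({**r, "email": email})
    return clean
```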

4. Establish Clear Data Governance Protocols

One of the biggest emerging issues facing data management is data governance. Because of the sensitive nature of many sources—consumer information, sensitive financial details, and so on—concerns about who has access to information are becoming a central topic in data management. Moreover, allowing free access to datasets and storage can lead to manipulation, mistakes, and deletions that could prove damaging.

It’s vital to establish clear and explicit rules about who can access data, when, and how. Creating tiered permission systems (read, read/write, admin) can help limit the exposure to mistakes and danger. Additionally, sorting data in ways that facilitate access to different groups can help manage data access better without the need to give free rein to all team members.
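The tiered permission idea can be sketched as an ordered set of access levels checked before any operation. The tier and action names are assumptions for illustration:

```python
# Tiered data-access check: read < read/write < admin. A user may
# perform an action only if their tier meets the action's requirement.
TIERS = {"read": 1, "read/write": 2, "admin": 3}
REQUIRED = {"read": 1, "write": 2, "grant": 3}

def can(user_tier, action):
    """Return True if the user's tier covers the requested action."""
    return TIERS[user_tier] >= REQUIRED[action]
```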

5. Create Dynamic Data Structures

Many times, storing data is reduced to a single database that limits how you can manipulate it. Static data structures are effective for holding data, but they are restrictive when it comes to analyzing and processing it. Instead, data managers should place a greater emphasis towards creating structures that encourage deeper analysis.

Dynamic data structures present a way to store real-time data that allows users to connect points better. Using three-dimensional databases, finding methods to reshape data rapidly, and creating more inter-connected data silos can help contribute to more agile business intelligence. Generate databases and structures that simplify accessing and interacting with data rather than isolating it.

The fields of data management and analytics are constantly evolving. For analytics teams, it’s vital to create infrastructures that are future-proofed and offer the best possible insights for users. By establishing best practices and following them as closely as possible, organizations can significantly enhance the quality of the insights their data produces.


Office Depot Stitches Together the Customer Journey Across Multiple Touchpoints

In January 2017, the AURELIUS Group (Germany) acquired the European operations of Office Depot, creating Office Depot Europe. Today, Office Depot Europe is the leading reseller of workplace products and services with customers in 14 countries throughout Europe, selling everything from paper, pens, and flip charts to office furniture and computers.

Centralizing Data to Respond to Retail Challenges

Traditionally, Office Depot’s European sales were primarily sourced through an offline, mail-order catalog model driven by telemarketing activities. The company has since moved to a hybrid retail model, combining offline and online shopping, which required a data consolidation strategy that optimized the different channels. Additionally, the company’s myriad backend systems and disparate supply chain data collected from across Europe had become difficult to analyze.

Using Talend, Office Depot can now ingest data from its vast collection of operational systems. The architecture includes an on-premise Hadoop cluster using Hortonworks, Talend Data Integration, and Data Quality to perform checks and quality control on data before ingesting it into the Hub’s data lake.

Powering Use Cases from Supply Chain to Finance

Integrating online and offline data results in a unified, 360-degree view of the customer and a clear picture of the customer journey. Office Depot can now create more specific audience segments based on how customers prefer to buy, and tailor strategies to reach the most valuable consumers whether they buy online or in-store. They can compare different offline customer experiences to see how they are influenced by digital ads. Customer service operators have complete information on a customer, so they can speak with them already knowing their details.

Office Depot’s data hub approach also provides high-quality data to all back-office functions throughout the organization, including supply chain and finance. Office Depot can now integrate data from the range of supply chain back-end systems in use in various countries, and answer questions such as which distribution center has the most efficient pick-line and why; or which center is in the risky position of having the least amount of stock for the best-selling products.

The post Office Depot Stitches Together the Customer Journey Across Multiple Touchpoints appeared first on Talend Real-Time Open Source Data Integration Software.

Originally Posted at: Office Depot Stitches Together the Customer Journey Across Multiple Touchpoints

Feb 14, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Statistically Significant


[ AnalyticsWeek BYTES]

>> Oct 12, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> Future of Public Sector and Jobs in #BigData World #FutureOfData #Podcast by v1shal

>> State of Data Warehouse: A GigaOm Market Landscape Report by analyticsweekpick


[ NEWS BYTES]

>> Cisco and Amazon partner on hybrid-cloud approach – MarketWatch Under Hybrid Cloud

>> Lack of analytics skill leads to over 76000 empty positions – People Matters Under Talent Analytics

>> How data science is shaping the modern NHS – New Statesman Under Data Science


[ FEATURED COURSE]

CS109 Data Science


Learning from data in order to gain useful predictions and insights. This course introduces methods for five key facets of an investigation: data wrangling, cleaning, and sampling to get a suitable data set; data managem… more

[ FEATURED READ]

How to Create a Mind: The Secret of Human Thought Revealed


Ray Kurzweil is arguably today’s most influential—and often controversial—futurist. In How to Create a Mind, Kurzweil presents a provocative exploration of the most important project in human-machine civilization—reverse… more

[ TIPS & TRICKS OF THE WEEK]

Data aids, not replace judgement
Data is a tool and a means to help build consensus and facilitate human decision-making, not to replace it. Analysis converts data into information; information via context leads to insight. Insights lead to decision-making, which ultimately leads to outcomes that bring value. So data is just the start; context and intuition play a role.

[ DATA SCIENCE Q&A]

Q:Is it better to spend 5 days developing a 90% accurate solution, or 10 days for 100% accuracy? Depends on the context?
A: * “Premature optimization is the root of all evil”
* At the beginning: quick-and-dirty model is better
* Optimization later
Other answer:
– Depends on the context
– Is error acceptable? Fraud detection, quality assurance


[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Dr. Nipa Basu, @DnBUS


[ QUOTE OF THE WEEK]

Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom. – Clifford Stoll

[ PODCAST OF THE WEEK]

Want to fix #DataScience ? fix #governance by @StephenGatchell @Dell #FutureOfData #Podcast


[ FACT OF THE WEEK]

Every person in the US tweeting three tweets per minute for 26,976 years.

Sourced from: Analytics.CLUB #WEB Newsletter

The UX of Brokerage Websites

Buying and selling stocks has dramatically changed since the advent of the web.

Online brokerages have made trades more accessible, faster, and dramatically cheaper for the retail investor.

Prior to the web, it was common for a full-service broker to charge 2.5% for a stock trade. Now transactions are low-cost commodities with many trades costing less than $10 (or even free).

But the ease of executing trades, access to account data, and the tsunami of financial information available online hasn’t taken away some common challenges for the typical investor.

To understand this experience, we benchmarked the user experience of six top US-based brokerage websites to understand how people are using these services and where the process can be improved.

We benchmarked the desktop and mobile user experiences of the following six brokerage websites:

  • Charles Schwab (schwab.com)
  • E*Trade (us.etrade.com)
  • Fidelity Investments (fidelity.com)
  • Merrill Lynch (ml.com)
  • TD Ameritrade (tdameritrade.com)
  • Vanguard (investor.vanguard.com)

We collected SUPR-Q data, including NPS data, and investigated reasons for using the website, users’ attitudes toward the website, and how well people understood key terms and features. We also supplemented the empirical data with a guideline review using the Calibrated Evaluator’s Guide (CEG) where evaluators score the websites using 107 standardized criteria that have been shown to impact the user experience.

Benchmark Study Details

We recruited 199 participants in August 2018 for a retrospective study where we asked current brokerage account holders to reflect on their most recent experiences using their brokerage company’s website.

Participants in the study answered the 8-item SUPR-Q (including the Net Promoter Score) and questions about their prior experiences.

Quality of the Brokerage Website User Experience: SUPR-Q

The SUPR-Q is a standardized measure of the quality of a website’s user experience and is a good way to gauge users’ attitudes. It’s based on a rolling database of around 150 websites across dozens of industries, including brokerages.

Scores are percentile ranks and tell you how a website experience ranks relative to the other websites. The SUPR-Q provides an overall score as well as detailed scores for subdimensions of trust, usability, appearance, and loyalty.

The scores for the six brokerage websites are well above average, which isn’t too surprising given the financial incentive to make transactions easy and frequent. The average SUPR-Q is at the 92nd percentile (scoring better than 92% of the websites in the database). Merrill Lynch has the lowest SUPR-Q of the group with a score at the 88th percentile. E*Trade and TD Ameritrade lead the group with scores at the 95th percentile. Brokerages as a group also scored higher than the banking websites we recently evaluated.

Usability Scores & Trust

We asked participants to reflect on how easy they thought it was to use and to navigate through their brokerage websites. Fidelity has the highest score in the group (at the 93rd percentile) and Vanguard has the lowest usability score at the 77th percentile.

Not surprisingly, trust scores for this group are also high as users are confident in the companies that handle their money. Participants rated their brokerage companies very high in trust and credibility, and all sites in our study scored between the 92nd and 99th percentiles. This is higher than our group of retail banking websites,  which had trust scores between the 76th and 95th percentiles.

Loyalty/Net Promoter Scores

The brokerage websites have an average NPS of about 41%. While we’ve seen that users are more reluctant to recommend their banking websites (with an average NPS of 16%), the same was not the case with other financial websites. This relatively high likelihood to recommend may also reflect a long-term bull market, with 401k retirement accounts at all-time highs [PDF]. U.S. equities rose about 40% in the two years leading up to the data collection period in August 2018. That is, people may be more willing to purchase stocks, view mutual fund balances, and recommend their brokerage because of positive financial experiences (as opposed to actual differences in the websites).

Use of Brokerage Sites & Mobile Apps

As a part of this benchmark, we asked participants how they accessed their brokerage site and the activities they attempted on their last visit and in the last year. Not surprisingly, most participants use a desktop or laptop computer to access their brokerage site. However, on average, about 35% of participants reported using their mobile devices (split about equally between the mobile website and the mobile app).

The mobile app is most popular for checking portfolios while the desktop site is used for tasks such as investment research and transferring funds. The brokerage mobile app usage is lower than the banking app usage, which showed significantly more mobile app and mobile website usage (46% and 64%, respectively).

Overall, checking account balances is the top reason to visit the brokerage sites (27%), followed by looking at the recent performance of stocks and investments (24%). The top tasks were the same across desktop and mobile. More details on the mobile app experience are available in the report.

The Learning Curve & Understanding Jargon

Users on some of the brokerage sites (Charles Schwab, E*Trade, and Vanguard) mentioned a steep learning curve at the start of use, especially with complex jargon.

  • “To the uncertain investor they have a lot of information that is complex and it can be very overwhelming looking at all of the investing lingo they have on their website. Such as looking at their EFTS, bonds, stocks, 401k, and IRA options—there are a lot of things that you can choose from and getting started is complex.”—Vanguard user
  • “Honestly, my only problems were at the beginning, when I was still learning their site’s features and layout.”—Charles Schwab user
  • “I was a little lost and there wasn’t a lot of guidance. I had to do a lot of my own research to get started.”—E*Trade user

To further assess how participants understood common jargon, we asked participants to describe three common brokerage terms in their own words. A YTD return was the most widely known term (91%), whereas only 64% of participants knew what a prospectus was, and 73% knew what an ETF was. This comprehension was generally higher than the health insurance websites.

Calling Customer Service

As with banking websites, taking calls is expensive and time consuming. On average, 13% of brokerage website respondents reported contacting customer service in the last year (lower than the 20% on banking websites and 30% for health insurance websites). Being able to accomplish the task without calling customer service was one of the key drivers of UX quality; it explained 8% of the variation in SUPR-Q scores.
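For readers curious how a figure like "explained 8% of the variation" is derived: for a single driver it is the squared correlation (r²) between the driver item and the SUPR-Q score. A plain-Python sketch, assuming paired per-respondent scores:

```python
# r-squared between a key-driver item (e.g. "accomplished the task
# without calling support") and an outcome (e.g. SUPR-Q score):
# the share of variance in y explained by x.
def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)
```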

Also, like the banking websites, the top reason for brokerage respondents to call customer support is login troubles. However, this only accounted for 5% of participants calling in (compared to 29% for banking and 21% for health insurance). Other common reasons for calling in were researching particular products (stocks, bonds, ETFs) (5%), and transferring funds (3%).

Having an online chat system can help reduce the number of calls to customer support and it’s something our respondents were looking for.

“I wish they had a way to contact customer service other than by phone. I prefer using email or chat to contact customer service so that I can have text to refer back to if I’m trying to get help with something, but they only seem to allow phone contact.”—Vanguard user

In the guideline review (more info can be found in the report), Vanguard has the lowest scores on Help & Information partly because it doesn’t have an online chat option.

Product Pages Challenging

While we didn’t conduct a usability evaluation for this industry, to identify problem areas, we did conduct a detailed guideline review using the CEG. The top weakness identified by evaluators on both Merrill Lynch and Fidelity was the organization of the product pages.

For Merrill Lynch, the website lacks a clear path to finding investment products because the information is not organized effectively. The page lacks a clear element hierarchy or a sidebar to narrow choices into products like ETFs, retirement accounts, stocks, or mutual funds.

 “It’s a little hard to navigate unless you know exactly what you’re looking for. Otherwise, it takes time to navigate and look at all of their options.”—Merrill Lynch user

Image 1: Product pages from the Merrill Lynch website show a lack of hierarchy on the page.

A similar challenge exists on the Fidelity website. While it has a better visual arrangement, the vast number of products is hard for the typical retail investor to differentiate, as Fidelity has no apparent overall product comparison page to help users understand and compare its financial products.

Image 2: There’s no overall investment products page where users can look at products with descriptions from a high-level view.

Image 3: Users must navigate to each individual product page to get any information, which can seem tedious.

The websites that performed well in the product pages category, like E*Trade, have an overview of the investment products with short descriptions and an option to learn more on a detailed product page.

Image 4: E*Trade makes it easy for users to quickly digest the different product options and see where they want to dive deeper.

Key Drivers

To better understand what’s affecting SUPR-Q scores, we asked respondents to rate their agreement with the following statements about their brokerage website on a five-point scale (1 = strongly disagree to 5 = strongly agree):

  1. My portfolio dashboard is clear and understandable.
  2. I can buy or sell stocks and bonds easily.
  3. The [Brokerage] site is consistent however I use it.
  4. I can find and understand my investment fees and taxes.
  5. I can accomplish what I want to do without calling customer support.
  6. The [Brokerage] site helps me to research and understand investment products.
  7. The [Brokerage] website keeps my information secure.
  8. I can find my account details quickly and securely.

Information Security & Dashboards

The top key driver of the brokerage website user experience is keeping user information safe and secure: “The brokerage website keeps my information secure” explained 16% of the variation in SUPR-Q scores.

The second biggest driver of perceived website UX quality is a clear portfolio dashboard, which explained 12% of the variation in SUPR-Q scores. Checking portfolio balances is also a top task for this group, so it’s not surprising that a clear dashboard is so important to the quality of the brokerage website experience.
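These “variation explained” figures come from a key-driver analysis: each item’s ratings are regressed against respondents’ overall SUPR-Q scores, and the R² of that regression is the share of variation explained. Here is a minimal sketch with simulated data (the ratings, scores, and relationship below are illustrative, not from the study):

```python
import numpy as np

# Illustrative data: 1-5 agreement ratings for one survey item and each
# respondent's SUPR-Q score, simulated with a noisy linear relationship.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=200).astype(float)
suprq = 50 + 5 * ratings + rng.normal(0, 12, size=200)

# Simple linear regression of SUPR-Q on the item ratings.
slope, intercept = np.polyfit(ratings, suprq, 1)
predicted = slope * ratings + intercept

# R^2: the share of variance in SUPR-Q scores explained by this one item.
ss_res = np.sum((suprq - predicted) ** 2)
ss_tot = np.sum((suprq - suprq.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"Variation explained: {r_squared:.0%}")
```

With real survey data, the ratings and scores would come from the same respondents, and items would be compared by their R² values to rank the drivers.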

All six sites we tested scored high on the item “My portfolio dashboard is clear and understandable” (Item 1). The lowest score was for TD Ameritrade, where 88% agreed or strongly agreed, compared to the highest for Vanguard and Charles Schwab (both 94%).
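The “agreed or strongly agreed” figures are top-two-box scores: the share of respondents answering 4 or 5 on the five-point scale. A quick sketch with hypothetical responses:

```python
# Hypothetical 1-5 responses to "My portfolio dashboard is clear and understandable".
responses = [5, 4, 4, 5, 3, 5, 4, 2, 5, 4, 4, 5, 3, 4, 5, 4]

# Top-two-box: the proportion answering 4 (agree) or 5 (strongly agree).
top_two_box = sum(r >= 4 for r in responses) / len(responses)
print(f"Agreed or strongly agreed: {top_two_box:.0%}")  # prints "Agreed or strongly agreed: 81%"
```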

Summary

An analysis of the user experience of six brokerage websites found:

1. Brokerage websites offer above-average UX across devices. Current users find their brokerage website experience well above average, with SUPR-Q scores falling at the 92nd percentile. Merrill Lynch scored the lowest, though still above average, at the 88th percentile, while E*Trade and TD Ameritrade lead the pack at the 95th percentile. Mobile apps are widely used in this industry, with 35% of respondents reporting using the app or website within the past year.

2. Users need to overcome a steep learning curve and overwhelming product pages. Respondents on half of the brokerage sites (Charles Schwab, E*Trade, and Vanguard) mentioned a learning curve when first using the website. Many said the brokerage sites didn’t lend themselves to novice investors and that some jargon was confusing at first. Improved onboarding and more common language could improve the experience on these sites. The product pages, in particular, may be difficult for less experienced users to differentiate and select from.

3. An understandable dashboard and security are key drivers of UX. The top two key drivers of the brokerage website user experience were an understandable portfolio dashboard (12%) and information security (16%). Users want a quick and easy view of their portfolio and they don’t want to worry about their personal information being compromised (they probably also want their portfolio balances to increase).

Full details are available in the downloadable report.

 


Originally Posted at: The UX of Brokerage Websites

Hilarious 12 tweets from 2012 VP Debate

From Joe Biden’s smirks, smiles, laughs, sharp elbows, and impolite interruptions to Paul Ryan’s focused and serious attitude, the 2012 VP debate was both fun and informative. Both sides were trying to seem reasonable and focused. Along the way, an ocean of hilarious tweets emerged and made us laugh.

Here is my list of 12 tweets that stood out in yesterday’s debate:

Patton Oswalt — Ryan is a nervous Walmart manager. Biden is an irate customer with the receipt, the warranty & he’s friends w/ the store owner. #debate

Josh Branson — BREAKING: Post-debate poll has … Biden interrupting the results. #vpdebate

Bill Maher — Hello 9 1 1? There’s an old man beating a child on my tv

GuyEndoreKaiser — Tonight’s debate is scheduled for ninety minutes, but Paul Ryan is already claiming he can finish in fifty something.

Kelly Oxford — While Ryan speaks, Biden looks like he’s trying to order a drink at the bar and the bartender is ignoring him. #vpdebate

Jill Morris — The VP candidates get to sit because they’re exhausted from standing up for our values. #VPDebate

Paul Ryan Gosling — Hey girl, I’m not taking nervous sips of water, I’m drinking every time Biden laughs at me. #vpdebate

Jose Antonio Vargas — Before this #VPDebate, @JoeBiden had a Venti macchiato and two cans of Red Bull.

Morgan Murphy — Biden’s teeth are so white they’re voting for Romney. #VPDebate

Indecision — Watered-down sanctions are the worst. You need four just to get tipsy. #vpdebate

James Garrett — I kind of feel like Joe Biden is Kanye and Paul Ryan is Taylor Swift. #VPDebate

National Review — Wait, is Biden yelling at Martha Raddatz right now? I thought he was debating Paul Ryan… #VPDebate

Jeffrey Wisenbaugh — Biden is yelling less. I think it’s getting closer to his bed time. #sleepy #VPDebate

Chad Schomber — To think, all this just to sway 4-6% of undecided voters. And those folks are not watching the #VPdebate

Seth Masket — Actually, this is like if Aaron Sorkin wrote an exchange between the Skipper and Gilligan. #vpdebate

Image source: RRStar

Originally Posted at: Hilarious 12 tweets from 2012 VP Debate by v1shal

Feb 07, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Correlation-Causation (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Why Is Big Data So Big In Health Care? by analyticsweek

>> Top 5 Lessons LEGO story teaches an entrepreneur by v1shal

>> Voices in Data Storage – Episode 3: A Conversation with Leo Leung of Oracle by analyticsweekpick

Wanna write? Click Here

[ NEWS BYTES]

>> Smarter AI: Machine learning without negative data – Science Daily Under Machine Learning

>> Data center security company A10 Networks could be eyeing a sale – DatacenterDynamics Under Data Center

>> How tech and data can spot and stop the quitters – Raconteur Under Sentiment Analysis

More NEWS ? Click Here

[ FEATURED COURSE]

Deep Learning Prerequisites: The Numpy Stack in Python

The Numpy, Scipy, Pandas, and Matplotlib stack: prep for deep learning, machine learning, and artificial intelligence… more

[ FEATURED READ]

The Industries of the Future

The New York Times bestseller, from leading innovation expert Alec Ross, a “fascinating vision” (Forbes) of what’s next for the world and how to navigate the changes the future will bring…. more

[ TIPS & TRICKS OF THE WEEK]

Data Have Meaning
We live in a Big Data world in which everything is quantified. While the emphasis of Big Data has been focused on distinguishing the three characteristics of data (the infamous three Vs), we need to be cognizant of the fact that data have meaning. That is, the numbers in your data represent something of interest, an outcome that is important to your business. The meaning of those numbers is about the veracity of your data.

[ DATA SCIENCE Q&A]

Q: What do you think about the idea of injecting noise into your data set to test the sensitivity of your models?
A: * The effect would be similar to regularization: it helps avoid overfitting
* Noise injection is also used to increase model robustness

Source
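The noise-injection idea above can be sketched in a few lines: fit a model, perturb the inputs with Gaussian noise at increasing scales, and measure how far the predictions move (the data and least-squares model here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: three features with a known linear signal plus a little noise.
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(0, 0.1, size=200)

# Fit an ordinary least-squares linear model.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Inject Gaussian noise into the features and measure how far predictions move.
drifts = []
for scale in (0.01, 0.1, 0.5):
    X_noisy = X + rng.normal(0, scale, size=X.shape)
    drift = np.mean(np.abs(X_noisy @ coef - X @ coef))
    drifts.append(drift)
    print(f"noise sd={scale}: mean prediction change={drift:.3f}")
```

Prediction drift that grows sharply even at small noise scales signals a sensitive (possibly overfit) model.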

[ VIDEO OF THE WEEK]

@DrewConway on creating socially responsible data science practice #FutureOfData #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

The most valuable commodity I know of is information. – Gordon Gekko

[ PODCAST OF THE WEEK]

@SidProbstein / @AIFoundry on Leading #DataDriven Technology Transformation #FutureOfData #Podcast

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

Data is growing faster than ever before and by the year 2020, about 1.7 megabytes of new information will be created every second for every human being on the planet.

Sourced from: Analytics.CLUB #WEB Newsletter

10 Analytics Blog Posts to Read Before 2019

It should come as no surprise that 2018 was a crazy and explosive year for BI and analytics. As a market that changes on what feels like a daily basis, there’s always a new technology to learn about or a new way to up your analytics game. With such a fast-paced industry, there’s never a lack of things to write about and here at Sisense, we take it upon ourselves to be a one-stop shop for all the analytics content and learning you need.

With that in mind, it’s time for one of my favorite posts of the year! Here’s a wrap up of the top 10 blogs that you shared the most over the past 365 days (based on data, of course).

Let’s get down to it!

10. Gartner Grapevine 2018 Wrap-Up: Day One


This year, Gartner Grapevine was jam-packed with impactful sessions and tons of analytics knowledge. In his day-one wrap-up, our VP of Product, Boaz Farkash, talks about his key takeaways for BI leaders to address in order to scale value for their customers. Spoiler alert: it has to do with a fundamental shift in the market from BI to AI.

9. Data Science vs. Data Analytics – What’s the Difference?

Data science and data analytics are often mistaken for one another, and it can be confusing to differentiate between them. Despite being interconnected, the two pursue different approaches and produce different results. We break down the differences in this post.

8. How To Get More From Your Data With Embedded Analytics


Embedded Analytics, or analytics capabilities bundled into business applications, are making a dramatic foray. If you’re not already using embedded analytics in your organization, this post will help you understand what data you’re leaving on the table.

7. The Unexpected Connections Between Bitcoin and The Dow

Part of our new series, GoFigure!, this post and the accompanying analysis report and interactive dashboard dig through data to see if there are any connections between Bitcoin and The Dow. Our findings might surprise you.

6. Going Embedded: The Pillar of Analytics Success

Sure, you can build your own analytics solution. But, by the time it’s ready to launch your competitors will be way ahead of you. Crafted for the R&D professional, this post breaks down what functionalities you should consider when embedding analytics.

5. Retail Predictive Analytics – How to Use Predictive Analytics in Retail

Few fields are as well suited to predictive analytics as retail. In an industry where businesses succeed by effectively uncovering what customers will like next, predictive analytics can be the difference between a strong revenue stream and a dwindling sales pool.

4. It’s International Women’s Day…Here Are 17 Badass Women Working in Data

Although every day is a day to celebrate the women who are shaping the data world, on this year’s International Women’s day we celebrated 17 of the most awesome ladies out there championing the data and analytics cause. If you don’t know these women, you should!

3. The Best Big Data Applications for Financial Services


Today’s financial service providers operate almost entirely online, and every single transaction and penny transmitted creates hundreds of data points. This post details how financial service organizations can find the right data streams and KPIs.

2. Refugee migration: Where are people fleeing from and where are they going?

Millions of people live as refugees and individuals seeking asylum. Where are they fleeing and where are they seeking sanctuary? In this edition of GoFigure!, our blog post, original analysis, and interactive dashboard take a look at data from the World Bank and reveal insights on the world refugee crisis.

1. The Next Generation of BI is Here… And So Is The 2018 Gartner Magic Quadrant for Analytics and Business Intelligence Platforms


Every year, BI vendors wait on pins and needles to see where they’ll be placed in the Gartner Magic Quadrant for Analytics and Business Intelligence Platforms. This year’s top blog post announces our exciting position (Visionaries!) as well as what our VP of Strategic Growth and Innovation, Guy Levy-Yurista, PhD, thinks the Quadrant means for the analytics market.

Thanks for being loyal readers. Happy 2019!

Source: 10 Analytics Blog Posts to Read Before 2019