
Tech News - Data

1209 Articles

Continuing to invest in our unique approach to Data Management from What's New

Matthew Emerick
17 Sep 2020
4 min read
Tracy Rodgers, Senior Product Marketing Manager | September 17, 2020

It’s hard to believe that a year ago today, we launched Tableau Catalog, part of our Data Management offering. It was the first time our customers were really able to see our unique vision for self-service data management in practice. At Tableau, we believe that Data Management is something that everyone, both IT and the business, should benefit from, and we built an integrated solution that reflects that belief. Since launch, we’ve continued to deliver more features and more value to organizations of all sizes. Tableau Catalog provides data quality warnings, helpful metadata, lineage, and more, all in the context of analysis.

Self-service requires trust, visibility, and governance

Data Management solutions have been on the market for more than 30 years. However, they’ve always approached data management problems from the perspective of IT, which can leave data duplicated, mistrusted, or too locked down. Today we have more data, in more formats, at more levels (from enterprise to departmental to individual data needs), and in more places than ever before. At the same time, more people are expected to use data to do their jobs effectively and efficiently. This is where we’re seeing a gap: everyone in an organization needs to be able to find, access, and trust the right data so they can do their jobs effectively, while using secure data responsibly. There are solutions that address data preparation, cataloging, and analytics, and while each is valuable in its own right and addresses specific problems, few vendors bring it all together. Tableau has created a convergence for data management, helping solve the most prominent challenges on one platform with a successful data management experience, regardless of who the user is.

That’s been our vision since before we released Tableau Catalog last September, and now we’re seeing people put it to work. With Tableau Data Management, organizations are providing more visibility and trust in data analysis than ever before. Whether it’s a CEO reviewing a finance dashboard or a data steward maintaining a server, everyone can better understand what the data means, where it comes from, and whether it’s up to date. Where enterprise data catalogs provide a comprehensive catalog of the data in your ecosystem, Tableau focuses on the analytics catalog, giving people the information they need when and where they need it, directly in the flow of their analysis.

Lake County Health Department is one organization that has not only re-examined its data strategy as a result of data management, but has also scaled how it educates its consumers in a timely manner, at a moment when having accurate data at your fingertips is more important than ever. We’ve seen both enterprise and commercial businesses use Tableau Data Management to identify PII (personally identifiable information) and remove it from their business processes. Internally at Tableau, we’ve used Tableau Catalog to move efficiently from one database to another with minimal impact on the enterprise at large.

There’s plenty more to come for Tableau Data Management

Tableau is dedicated to helping people take advantage of self-service data management, and that means continuing to innovate so our offering supports more use cases. With our 2020.3 release, we introduced the capability to write to external databases from Tableau Prep, which lets organizations leverage their existing database and governance investments while providing a single platform that goes from data prep to analytics, seamlessly. We’ve also heard great feedback from customers that has already shaped updates: for example, we introduced high-visibility data quality warnings in 2020.2 after hearing that the default data quality warnings weren’t catching users’ attention.

In upcoming releases, we have some even bigger product announcements related to Data Management; you aren’t going to want to miss them. Make sure to tune into the Tableau Conference-ish keynote on Tuesday, October 6th to hear where we’re taking Data Management next. Register today; it’s completely free!
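Tableau Prep’s write-to-external-database step is configured in the Prep UI rather than in code, but the idea behind the feature can be sketched in plain Python: clean a dataset, then land the result in a governed database table. Everything below (the sample rows, the table and column names) is illustrative, not Tableau’s API:

```python
import sqlite3

# Raw rows as they might come out of a prep flow: (region, sales).
raw = [("East", 100.0), ("East", 250.0), ("West", 175.0), (None, 90.0)]

# "Prep" step: drop incomplete rows, then aggregate sales per region.
totals = {}
for region, sales in raw:
    if region is not None:
        totals[region] = totals.get(region, 0.0) + sales

# "Output" step: write the cleaned result to an external database table,
# where downstream analytics (or other tools) can pick it up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_by_region (region TEXT, sales REAL)")
conn.executemany("INSERT INTO sales_by_region VALUES (?, ?)", sorted(totals.items()))
rows = conn.execute("SELECT region, sales FROM sales_by_region ORDER BY region").fetchall()
print(rows)  # → [('East', 350.0), ('West', 175.0)]
```

The point of the feature is exactly this last step: the prepared output lands in a shared database instead of a local file, so governance applies to it like any other table.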

Election 2020: How data can show what’s driving the Trump vs. Biden polls from What's New

Matthew Emerick
14 Oct 2020
8 min read
Steve Schwartz, Director, Public Affairs at Tableau | October 14, 2020

It’s October, which means there is officially less than one month until the 2020 Presidential election on November 3. Opinion polls on the race between current President Donald Trump and the challenger, former Vice President Joe Biden, are everywhere. Although many people see public opinion polls as a way to anticipate the outcome of the election, they are most valuable when considered as a snapshot of people’s beliefs at a given moment in time.

Through our partnership with SurveyMonkey and Axios to collect and share data on the 2020 Presidential race, we’ve created a dashboard where you can track how survey respondents are feeling about the candidates. But looking at candidate preference data alone doesn’t answer the critical question of this year’s election: what is driving voter preference? This year, that’s an especially tricky question. There are the major issues confronting the country, from the COVID-19 pandemic, to the disease’s impact on the national and global economies, to the nationwide protests for racial justice and equity. And there’s also the news cycle, which seemingly tosses another knuckleball at voters before they’ve had a chance to process the last one.

By partnering with SurveyMonkey, we’ve been able to tap into their vast market research technologies to reach the public and visualize their answers to these critical questions. Through our Election 2020 platform, you can dig into this data and expand your understanding of not only what the topline polls are saying, but what is top of mind for the voters making the decision this year. We’ll walk you through some of the key data you can find on our Election 2020 pages, and why it’s so critical to understanding this year’s political landscape.

Preferences by demographics

Understanding the way different demographic groups vote is critical. It’s very common for pollsters to break down results by categories like age bracket, race, and gender: disaggregating data offers valuable insights into trends among voter groups that can inform understanding of potential election results. But the way the data is often presented, either in static crosstabs or in percentage points scattered throughout an analysis, doesn’t really give people insight into voters’ intersectionalities and how they play out in the data.

SurveyMonkey wanted to give people a way to explore demographic data in a more granular and comprehensive way. They’ve broken down data on candidate preference by five demographic categories (age, race, gender, education level, and party ID), and in a Tableau dashboard anyone can choose which categories to combine to see more nuanced voter preferences. For instance, looking at gender alone, the breakdown is pretty clear: 52% of men support Trump, and 56% of women support Biden. But in this dashboard you can also layer in race, and suddenly the picture becomes much more complex: 87% of Black women and 76% of Black men support Biden, while just 38% of white men support Biden and 44% of white women support Trump. Add another dimension, like education, and the numbers become even starker: by far, the group that most strongly supports Trump (at 70%) is white men with a high school degree or less, and the group with the strongest across-the-board support for Biden is Black women with a postgraduate degree (91%).

"From the perspective of someone who's immersed in crosstabs and bar charts every day, this visualization is the clearest example yet of the value of pairing data collected through SurveyMonkey's mighty scale with visual storytelling tools from Tableau. The fact that it's highly interactive and responsive really brings the data to life in a way that isn't possible using standard tools," Wronski says.
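The dashboard’s layering of demographic dimensions is, at heart, a grouped crosstab. A minimal sketch of that computation, using invented respondent records rather than SurveyMonkey’s data:

```python
from collections import Counter

# Hypothetical respondent records (invented for illustration, not
# SurveyMonkey's data): each tuple is (gender, race, candidate).
respondents = [
    ("woman", "Black", "Biden"), ("woman", "Black", "Biden"),
    ("man",   "Black", "Biden"), ("man",   "Black", "Trump"),
    ("man",   "white", "Trump"), ("man",   "white", "Trump"),
    ("woman", "white", "Biden"), ("woman", "white", "Trump"),
]

def preference(records, *dims):
    """Percent support for each candidate within each combination of the
    chosen demographic dimensions (dims are indices into each record)."""
    counts, totals = Counter(), Counter()
    for rec in records:
        key = tuple(rec[d] for d in dims)
        counts[(key, rec[-1])] += 1   # (demographic combo, candidate)
        totals[key] += 1              # respondents in that combo
    return {k: round(100 * v / totals[k[0]]) for k, v in counts.items()}

# One dimension (gender) versus two dimensions (gender x race):
by_gender = preference(respondents, 0)
by_gender_race = preference(respondents, 0, 1)
print(by_gender[(("woman",), "Biden")])             # → 75
print(by_gender_race[(("man", "white"), "Trump")])  # → 100
```

Each extra dimension just extends the grouping key, which is why the picture sharpens (and the subgroups shrink) as you layer in more categories.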
The COVID-19 pandemic

Let’s start with the big one. COVID-19 has posed one of the most significant challenges to the United States and its citizens in recent memory. Over 200,000 people have died, and the economy has recorded its steepest drop on record, with GDP declining more than 9%. As we near the election, the virus is not showing signs of abating (for the latest data on COVID-19, you can visit Tableau’s COVID-19 Data Hub). Our partners at SurveyMonkey have been tracking public sentiment around the pandemic since February, as it’s impacting the lives of nearly everyone in the United States this year.

"The coronavirus pandemic has infiltrated every aspect of life for the past eight months, and it will continue to do so for the foreseeable future. We wanted to make sure to start measuring concerns early on, and we're committed to tracking public sentiment on this topic for as long as necessary," says Laura Wronski, research science manager at SurveyMonkey.

Through our Election 2020 portal, you can analyze data on how the public is feeling about the pandemic in the leadup to the election. SurveyMonkey has asked respondents about their personal concerns around the virus: whether they are worried about contracting it themselves or about someone in their family being affected, and whether they are worried about the pandemic’s impact on the economy. Because SurveyMonkey’s Tableau dashboards make it easy to filter these responses by a number of demographic factors, from age to political affiliation, you can begin to see patterns in the data and understand how concerns around COVID-19 could be a key factor in shaping the outcome of the election.

Government leadership

Elections are nearly always a referendum on leadership, and this year is no different. However, the pandemic is adding a new layer to how voters assess their elected leaders across the country. "As the election approaches, politicians who are on the ballot at every level will be judged by how well they responded to the coronavirus this year, both in terms of its effect on the economy through lost jobs and shuttered businesses and in terms of the public health infrastructure's response," Wronski says.

Digging into the data, you can see virtually no difference along any demographic breakdown between people’s assessment of Trump as a leader overall and their opinion of how he is handling the federal response to COVID-19. That tells you several things: voters’ opinions are, at this point, fairly solidified, and COVID-19 is a significant driver of that opinion. The data on how respondents feel about their state government’s response to COVID-19 shows some interesting trends. The clearest split, in many states, is along party lines. In Pennsylvania, for instance, 82% of Democrats approve of the state response, while 71% of Republicans disapprove. In South Carolina, 73% of Republicans approve of the response, and 74% of Democrats disapprove. It gets more interesting along other demographic lines, though. Here’s the split along gender lines in Pennsylvania: 60% of women approve, and 49% of men disapprove. And in South Carolina, 54% of women disapprove, and 54% of men approve.

"Like so much else these days, Republicans and Democrats are split in their views of how worrisome the coronavirus is and how well we've responded to it. Those partisan effects far outweigh any differences by age, gender, race, or other demographic characteristics," Wronski says.

Voting

COVID-19 has complicated nearly every aspect of the 2020 election, including voting. Multiple news outlets are reporting a sharp uptick in requests to vote by mail this year, due to concerns about gathering in public amid a pandemic. Through their data on how likely people are to vote by mail, SurveyMonkey is able to show a clear split along party lines: overall, 70% of Democratic respondents say they’re likely to vote by mail, while 72% of Republican respondents say the opposite. Axios, our media partner in the Election 2020 initiative, has analyzed what this means for the potential outcome, and what the implications could be if mail-in ballots are disqualified due to complications with the system.

"More people will vote by mail in this election than in any previous election, and that will reshape the logistics of the electoral tallying process and the entire narrative that we see on the news on Election Day. It's important for us to understand those dynamics early on so that we can help explain those changes to the public," Wronski says.

Exploring with data

Now that you have a sense of the information SurveyMonkey is polling for and why, and how to discover it in Tableau, we hope you take some time to dig into the data and gather your own insights. As the election nears, SurveyMonkey, Tableau, and Axios will continue to deliver more data and analysis around the political landscape, so keep checking back to the Election 2020 page for the latest.

Announcing improved MDX query performance in Power BI from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
07 Oct 2020
1 min read
With a recent update of the Analysis Services Tabular engine in Power BI, Multidimensional Expressions (MDX) clients, such as Microsoft Excel, can now enjoy improved query performance against datasets in Power BI.

Thank Stanford researchers for Puffer, a free and open source live TV streaming service that uses AI to improve video-streaming algorithms

Natasha Mathur
18 Jan 2019
2 min read
A team of researchers from Stanford University launched a new, free, and open source TV streaming service called Puffer as part of a non-profit academic research study. It is led by Francis Yan, a doctoral student in computer science at Stanford, along with Sadjad Fouladi, Hudson Ayers, and Chenzhi Zhu. Puffer uses machine learning to improve video-streaming algorithms. “We are trying to figure out how to teach a computer to design new algorithms that reduce glitches and stalls in streaming video (especially over wireless networks and those with limited capacities, such as in rural areas),” say the researchers.

Puffer is mainly focused on three kinds of algorithms: “congestion-control” algorithms, which decide when to send each piece of data; “throughput forecasters,” which predict how long it will take to send a certain amount of data over an Internet connection; and “adaptive-bitrate” (ABR) algorithms, which decide what quality of video to send for the best picture quality. The project is limited to 500 participants at a time. Participants need to watch TV channels on Puffer, streaming them over their Internet connections for a few hours each week. As participants stream TV channels on the Puffer website, it automatically experiments with different algorithms to control the timing and quality of the video sent to them. The researchers then analyze how the resulting computer-designed algorithms perform.

Puffer is a free service and doesn’t show any ads. It only retransmits free over-the-air broadcast TV signals and allows streaming of up to six TV stations: CBS (KPIX 5), NBC (KNTV 11), ABC (KGO 7), FOX (KTVU 2), PBS (KQED 9), and Univision (KDTV 14). The Puffer project has received funding in part from the NSF and DARPA, and support from Google, Huawei, VMware, Dropbox, Facebook, and the Stanford Platform Lab.

“Puffer is unique from previous academic studies...we hope that this approach will produce substantial benefits over prior work, but only time will tell”, say the researchers. For more information on Puffer, check out its official website.

Related reading:
Researchers introduce a machine learning model where the learning cannot be proved
Researchers release unCaptcha2, a tool that uses Google’s speech-to-text API to bypass the reCAPTCHA audio challenge
Researchers design ‘AnonPrint’ for safer QR-code mobile payment: ACSC 2018 Conference
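To make two of those algorithm families concrete, here is a minimal sketch (my own illustration, not Puffer’s code) of a simple throughput forecaster paired with an ABR rule that picks the highest video quality the forecast says the connection can sustain:

```python
def ewma_forecast(samples_kbps, alpha=0.3):
    """Exponentially weighted moving average of observed throughput:
    recent samples count more, old ones decay geometrically."""
    est = samples_kbps[0]
    for s in samples_kbps[1:]:
        est = alpha * s + (1 - alpha) * est
    return est

def pick_bitrate(forecast_kbps, ladder_kbps, safety=0.8):
    """Choose the highest rung of the bitrate ladder that fits within a
    safety margin of the forecast; fall back to the lowest rung."""
    fitting = [b for b in ladder_kbps if b <= safety * forecast_kbps]
    return max(fitting) if fitting else min(ladder_kbps)

observed = [3000, 2800, 1500, 1600]   # recent throughput samples (kbps)
ladder = [400, 1200, 2400, 4800]      # available video bitrates (kbps)
forecast = ewma_forecast(observed)
print(pick_bitrate(forecast, ladder))  # → 1200
```

Puffer’s research replaces exactly these hand-tuned pieces (the averaging rule, the safety margin) with learned models, then measures whether the learned versions stall less in the wild.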

Answering your questions around the new Power BI Premium per user license from Microsoft Power BI Blog | Microsoft Power BI

Matthew Emerick
23 Sep 2020
1 min read
Today’s announcement by Arun around the introduction of a Premium per user license option has generated a lot of interest and excitement in the Power BI community. It has also generated a lot of questions, so we’ve put together this blog post to answer some of the most common ones we’ve seen.

Data & insights for tracking the world’s most watched election from What's New

Anonymous
24 Sep 2020
4 min read
Data is how we understand elections. In the leadup to major political events, we all become data people: we track the latest candidate polling numbers, evaluate policy proposals, and look for statistics that could explain how the electorate might swing this year. This has never been more true than during this year’s U.S. Presidential election. Quality data and clear, compelling visualizations are critical to understanding the polls, and how sentiment among voters could influence the outcome of the race.

Today, Tableau is announcing a new partnership with SurveyMonkey and Axios to bring exclusive public opinion research to life through rich visual analytics. Powered by SurveyMonkey’s vast polling infrastructure, Tableau’s world-class data visualization tools, and Axios’ incisive storytelling, this resource will enable anyone to delve into of-the-moment data and make discoveries. In the leadup to the election, SurveyMonkey will poll a randomly selected subset of the more than 2 million people who take a survey on their platform every day, asking a wide range of questions, from election integrity to COVID-19 concerns to how people will vote.

“It’s never been clearer that what people think matters,” says Jon Cohen, chief research officer at SurveyMonkey. “With people around the world tuned into the U.S. presidential election, we’re showcasing how American voters are processing their choices, dealing with ongoing devastation from the pandemic, managing environmental crises, and confronting fresh challenges in their daily lives.”

The results from SurveyMonkey’s ongoing polls will be published in interactive Tableau dashboards, where anyone will be able to filter and drill down into the data to explore how everything from demographics to geography to political affiliation plays into people’s opinions. “To understand public views, we need to go beyond the topline numbers that dominate the conversation,” Cohen adds. “The critical debate over race and racial disparities and deeply partisan reaction to the country’s coronavirus response both point to the need to understand how different groups perceive what’s happening and what to do about it.”

As a platform purpose-built for helping people peel back the layers of complex datasets and gain insights, Tableau gives visitors a compelling avenue into better understanding this year’s pre-election landscape. “People need reliable, well-designed data visualizations that are easy to understand and can provide key insights,” says Andy Cotgreave, Tableau’s Director of Technical Evangelism. “As Americans make their decisions ahead of the election, they need charts that are optimized for communication. Tableau makes it possible to quickly and easily build the right chart for any data, and enables people to understand the data for themselves.”

Alongside the dashboards, Tableau and SurveyMonkey experts will contribute pointers on visualization best practices for elections, and resources to enable anyone to work better with survey data. And Axios, as the exclusive media partner for this project, will incorporate the data and visualizations into their ongoing analysis of the political landscape around the election.

At Tableau, we believe that data is a critical tool for understanding complex issues like political elections. We also believe that where data comes from, and how it’s understood in context, is essential. Through our partnership with SurveyMonkey and Axios, we aim to provide visitors with an end-to-end experience of polling data: from understanding how SurveyMonkey’s polling infrastructure produces robust datasets, to seeing them visualized in Tableau, to watching them inform political commentary through Axios. Data doesn’t just answer questions; it prompts exploration and discussion. It helps us understand the complex issues shaping our jobs, families, and communities. Helping people see and understand survey data can bring clarity to important issues leading up to the election and lets people dig deeper to answer their own specific questions.

Snap machine learning: The 46x faster than TensorFlow ML library by IBM

Savia Lobo
22 Mar 2018
2 min read
IBM claims that its new Snap Machine Learning library is 46x faster than TensorFlow. IBM’s Snap Machine Learning (Snap ML) is an efficient, scalable machine-learning library that enables very fast training of generalized linear models. IBM demonstrated that this new library can remove training time as a bottleneck for machine-learning workloads, paving the way to a range of new applications. Recently, the Snap ML library set a new benchmark by training ML models 46 times faster than TensorFlow. (Source: IBM Research)

Using the online advertising dataset released by Criteo Labs, which includes more than 4 billion training examples, IBM trained a logistic regression classifier in 91.5 seconds. Prior to this, the best result for training the same model belonged to TensorFlow, which trained it in 70 minutes on Google Cloud Platform.

The Snap ML library:
  • allows more agile development
  • enables faster and more fine-grained exploration of the hyper-parameter space
  • enables scaling to massive datasets
  • makes frequent retraining of models possible in order to adapt to events as they occur
(Source: Snap Machine Learning research paper)

Let’s take a look at the three distinct features of the Snap ML library:
  • Distributed training: Snap ML builds a data-parallel framework, allowing one to scale out and train on massive datasets that exceed the memory capacity of a single machine, which is crucial for large-scale applications.
  • GPU acceleration: Specialized solvers are designed to leverage the massively parallel architecture of GPUs while respecting data locality in GPU memory in order to avoid large data-transfer overheads.
  • Sparse data structures: Many machine learning datasets are sparse, so new optimizations were introduced for the algorithms in IBM’s system when applied to sparse data structures.

Read more about this exciting news in detail on IBM Research.
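Snap ML’s headline benchmark is training a logistic regression classifier. As a toy illustration of what that model is (not of Snap ML’s distributed, GPU-accelerated implementation, and with made-up data in place of the Criteo set), here is logistic regression trained with plain stochastic gradient descent:

```python
import math

def train_logreg(X, y, lr=0.5, epochs=300):
    """Fit weights and bias by per-sample gradient descent on the log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid: predicted P(click)
            g = p - yi                       # gradient of the log loss in z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z >= 0 else 0

# Toy ad-click data: clicks (1) vs. no clicks (0) on two features.
X = [[0.0, 1.0], [1.0, 1.0], [2.0, 0.0], [3.0, 0.5]]
y = [0, 0, 1, 1]
w, b = train_logreg(X, y)
print([predict(w, b, xi) for xi in X])  # → [0, 0, 1, 1]
```

Snap ML’s contribution is not the model itself (which is decades old) but making this training loop run across GPUs and machines on billions of sparse examples.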

An inside look at Tableau Virtual Training from What's New

Anonymous
01 Dec 2020
4 min read
Kristin Adderson | December 1, 2020

Virtual training is something I’m very passionate about because I’ve experienced firsthand how powerful it can be. But it recently occurred to me that if you’ve never taken any virtual training, it’s likely you don’t fully understand what it is. Is it eLearning? Pre-recorded webinars? It isn’t very clear. In our rapidly changing digital world, many learning offerings are ‘on-demand,’ and virtual training refers to any learning that can be done online. I’d like to give you a behind-the-scenes look at what you can expect in a Tableau virtual training.

When you attend a Tableau virtual training, you get the same in-depth content found in our in-person classrooms, delivered by an exceptional instructor. But the similarities end there. Don’t get me wrong; I love our in-person classes, but the last time I attended an 8-hour, all-day training, it wiped me out. Learning new content for 8 hours a day is exciting but exhausting. In contrast, our virtual training is delivered via Webex and scheduled over a week, typically for 2 ½ hours a day (this varies per class).

As an instructor, it’s always rewarding to see my students progress during the week. I love building rapport with them and seeing them connect with the product and with each other. Students still make connections in a virtual classroom, with classmates across the globe instead of across the room. A question from someone in Tokyo can inspire someone in Denmark. Virtual training averages about ten attendees per session, providing students with the classroom feel and the benefits of learning from each other. Activities are scattered throughout the training sessions, providing hands-on practice opportunities to apply what you’ve learned. With an instructor present, you get your questions answered before you get stuck. Additional practices are assigned as homework to encourage further exploration.

So how are questions handled during virtual training? You simply ask through your audio connection on your computer. We recommend using a headset with a microphone so your questions can be heard clearly. For those who don’t want to speak up in class, there’s a chat option that lets you address your question directly to the instructor or to the entire class. Get stuck on an issue while doing homework? Bring your questions to class the next day to discuss during the homework review period, or email the instructor, who will be happy to guide you in the right direction.

Questions aren’t a one-way street in Tableau virtual training. The instructor uses a variety of methods to engage each attendee. Classes kick off with the instructor and students introducing themselves; this helps build a community of learning and makes it easier to interact with the class when you know a little about everyone. The instructor encourages a high level of engagement and asks the class questions to track their understanding of the material. Responses can be given through audio, chat, or icons, and polls are sometimes used to validate understanding. All Tableau classroom training is available in a live virtual format. Our most popular classes (Desktop I and Desktop II) are offered in English, Spanish, Portuguese, French, German, and Japanese.

Get the virtual advantage

Virtual training holds a special place in my heart. Speaking as someone with many years of personal experience with virtual training, I still appreciate what it offers. To recap, here are five reasons why you should consider virtual training:
  • Same content, only more digestible. Virtual training contains the same content as our in-person classes, but broken into smaller segments. Learn a little at a time, absorb and apply the concepts, and come back the next day for more.
  • Available anytime, anywhere. Virtual training provides the flexibility to attend classes in multiple time zones and six different languages. As long as you have a strong internet connection, you are good to go.
  • Less disruptive to your daily schedule. Virtual training makes it easy to get valuable interaction with a live instructor without having to leave your office (or your house, for that matter).
  • Real-time feedback. No need to struggle on your own. Ask the instructor and other attendees questions, understand different use cases, and get the guidance you need while doing hands-on activities.
  • More practice makes perfect. There’s plenty of hands-on practice time, both during class and with extra homework.

“Time flies when you’re having fun with data” is an excellent way to describe the Tableau virtual training experience. If you’re looking for a flexible and fun way to gain valuable Tableau knowledge, go virtual. Take it from me, a virtual training veteran: you’ll never look at learning the same way again. Find a virtual class today!

Instaclustr releases three open source projects for Apache Cassandra database users

Sugandha Lahoti
13 Dec 2018
2 min read
Yesterday, Instaclustr released three open source projects easing Cassandra-Kubernetes integration and LDAP/Kerberos authentication. The three projects are a Cassandra operator for Kubernetes, an LDAP authenticator plugin, and a Kerberos authenticator plugin.

Cassandra on Kubernetes
The Cassandra operator functions as a Cassandra-as-a-Service on Kubernetes, making it easier for developers to combine these technologies. It provides a consistent environment and set of operations that are reproducible across production clusters and development, staging, and QA environments. The Cassandra operator is now ready to use in development environments through GitHub; enterprise support for it will start next year.

LDAP authenticator
The LDAP authenticator plugin lets developers benefit from secure LDAP authentication without writing their own solutions, and lets them transition to using the authenticator with zero downtime. The LDAP authenticator is freely available on GitHub, along with setup and usage instructions.

Kerberos authenticator
The Kerberos authenticator makes Kerberos’ secure authentication and true single sign-on capabilities available to developers using Apache Cassandra. This project also includes a Kerberos authenticator plugin for the Cassandra Java driver.

“With these open source projects, we’ve set out to empower any developer who wishes to pair Cassandra with Kubernetes, or take advantage of LDAP or Kerberos authentication within their Cassandra deployments. We invite anyone interested to join our community of contributors, and suggest or offer improvements to these open source projects,” said Ben Bromhead, CTO.

Related reading:
ScyllaDB announces Scylla 3.0, a NoSQL database surpassing Apache Cassandra in features
cstar: Spotify’s Cassandra orchestration tool is now open source!
Twitter adopts Apache Kafka as their Pub/Sub System
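For context, Cassandra selects its authenticator through `cassandra.yaml`, so adopting a plugin like this is a configuration swap rather than a code change. The fragment below is only a hedged sketch of what that typically looks like; the class name and property keys are illustrative assumptions, so consult the plugin’s README on GitHub for the exact values:

```yaml
# cassandra.yaml (illustrative fragment; exact authenticator class name
# per the plugin's README)
authenticator: LDAPAuthenticator

# Plugin-side LDAP settings usually live in a separate properties file.
# Illustrative keys only -- see the plugin's setup instructions:
#   ldap_uri: ldaps://ldap.example.com:636/
#   service_dn: cn=cassandra,ou=services,dc=example,dc=com
#   service_password: example-secret
```

Because the authenticator is pluggable, a rolling restart with the new setting is what makes the advertised zero-downtime transition possible.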


MongoDB Atlas will be available on Microsoft Azure Marketplace and will be a part of Microsoft’s Partner Reported program

Amrata Joshi
04 Sep 2019
2 min read
Yesterday, the team at MongoDB, the general-purpose data platform, announced the availability of MongoDB Atlas on Microsoft Azure Marketplace. The team also announced that MongoDB is set to be a part of Microsoft's strategic Partner Reported ACR co-sell program.

https://twitter.com/MongoDB/status/1168946141200883713

MongoDB Atlas on Azure integrates with Azure services including Azure Databricks, Power BI, and Sitecore on Azure. With the availability of MongoDB Atlas on Azure Marketplace, it will now be easy for established Azure customers to purchase MongoDB Atlas. The cost for Atlas will also be integrated into a customer's Azure bill, resulting in a single payment.

Atlas is now available across 26 Azure regions and serves thousands of customers that depend on MongoDB Atlas to drive their business.

Dev Ittycheria, President and CEO, MongoDB, said, "Microsoft has been a leader in making it easier for customers to consume and pay for cloud services, which are driving transformative innovations across many organizations." Ittycheria further added, "We are excited about the latest step in our strategic go-to-market partnership with Microsoft which will help bring MongoDB Atlas to the growing ecosystem of Azure Marketplace customers."

Scott Guthrie, Executive Vice President of Cloud and AI, Microsoft, said, "Since launching on Azure in 2017, MongoDB Atlas has been a popular service running on Azure. Today's announcement will make it even easier for customers to consume Atlas on Azure through the Azure Marketplace. We are committed to working alongside partners like MongoDB to give our joint customers best of breed choice in technology that meets their unique business demands."

What's new in data this week?
How to learn data science: from data mining to machine learning
LXD releases Dqlite 1.0, a C library to implement an embeddable, persistent SQL database engine with Raft consensus
After Red Hat, Homebrew removes MongoDB from core formulas due to its Server Side Public License adoption

Along with platforms like Facebook, now websites using embedded 'Like' buttons are jointly responsible for what happens to the collected user data, rules EU court

Vincy Davis
30 Jul 2019
5 min read
Yesterday, a significant judgement was made on the usage of Facebook's 'Like' feature by third-party websites. The Court of Justice of the European Union (ECJ) ruled that the operator of a third-party website with an embedded Facebook 'Like' button can be held jointly responsible for the initial collection and transmission of visitors' personal data, under the European Union's General Data Protection Regulation (GDPR). It also stated that "By contrast, that operator is not, in principle, a controller in respect of the subsequent processing of those data carried out by Facebook alone."

This ruling was made in a case filed by a German consumer protection association, Verbraucherzentrale NRW, against the online clothing retailer Fashion ID. The court found that a Facebook 'Like' button installed on a third-party website allows Facebook to collect users' information without their consent, irrespective of whether users clicked the button or were even members of the social network.

According to the press release, the ECJ has set guidelines that third-party website operators must seek consent from site visitors by clarifying the identity and purpose of the information transmission, before the data is handed over to Facebook. It also adds that, for lawful purposes, operators "must pursue a legitimate interest through the collection and transmission of personal data in order for those operations to be justified in that regard."

The 'Like' button, introduced by Facebook 10 years ago, is one of its most heavily used features, and has been adapted by most other social media platforms, like YouTube, Twitter, and Instagram. It makes sharing content, or opinions about content, on social platforms extremely convenient with a single click.
This ruling is significant because many online portals use the 'Like' button to make their products more visible on Facebook and do not bother about the consequences of sharing data with social media platforms. Last year, Facebook notified the UK parliament that between April 9 and April 16, the 'Like' button appeared on 8.4M websites. The judgement comes as a warning to all third-party websites, as they can no longer hide behind Facebook for their complicity in dodgy data gathering practices.

Last month, the BBC reported that Facebook uses information from the 'Like' button not only to alter newsfeeds and apply behavioural advertising, but also as a tool to target elections and manipulate people's emotional states.

Addressing the court judgement, Jack Gilbert, Associate General Counsel at Facebook, said, "Website plugins are common and important features of the modern Internet. We welcome the clarity that today's decision brings to both websites and providers of plugins and similar tools." He further added that they are reviewing the court's decision and "will work closely with our partners to ensure they can continue to benefit from our social plugins and other business tools in full compliance with the law."

Facebook, which believes there's no expectation of privacy on social media, has a record of trying to evade or delay justice by exploiting legal loopholes. A GDPR-violation lawsuit filed against Facebook by a privacy activist dragged on for five long years as Facebook repeatedly questioned whether GDPR-based cases fall under the jurisdiction of the courts, until the challenge was rejected by the Austrian Supreme Court this year. Last year, a lawsuit was filed against Facebook over a data breach impacting nearly 30 million users. In response, Facebook argued that some of the leaked information was not sensitive; however, an appellate court in San Francisco ruled against Facebook's appeal last month.
When caught red-handed, Facebook has attempted to deploy its PR and lobbying juggernaut to turn verdicts in its favour. Two months ago, reports emerged claiming that Facebook allegedly pressured and "arm-wrestled" an EU expert group to soften European guidelines for fake news. Facebook's chief lobbyist, Richard Allan, allegedly threatened the expert group by saying, "We are happy to make our contribution, but if you go in that direction, we will be controversial," and that Facebook would stop its support for journalistic and academic projects. We will have to wait and watch whether Facebook and other social media and content platforms will comply with the GDPR data protection framework this time, or whether they will again try to escape the law.

https://twitter.com/MaxMoranHi/status/1155855819868688384
https://twitter.com/PepperoniRollz/status/1155785646595817473

Some people are satisfied that the court ruling will make third-party websites think hard before sharing user information with Facebook. A user on Hacker News comments, "Good. I don't want you telling Facebook I've visited your website."

https://twitter.com/LguzzardiM/status/1155894027046158336
https://twitter.com/ChopinOpera/status/1155962319874080768

To tackle the plague of tracking via the Facebook 'Like' button, open source developers are coming up with their own solutions. Social Share Privacy is one such jQuery plugin project. It enables third-party websites to disable the Like/recommend button on their pages by default.

Read the Court of Justice of the European Union's press release for more information.

"Why was Rust chosen for Libra?", US Congressman questions Facebook on Libra security design choices
CraftAssist: An open-source framework to enable interactive bots in Minecraft by Facebook researchers
Facebook released Hermes, an open source JavaScript engine to run React Native apps on Android


Tune in to use collections, playlists for your data from What's New

Anonymous
10 Nov 2020
5 min read
Ann Ho, Senior Product Manager, Tableau | Tanna Solberg | November 10, 2020

Take a look at what you have in your music library. You've got songs from different albums, artists, and even genres. You probably have music for different moods, activities, or times of day. Just like with songs, the data you use isn't always contained in the same project on your Tableau site, and you might have trouble remembering where to find the assets you don't use regularly. With our new collections feature, you can gather data from across your site and organize it to fit how you use it—just like playlists!

If you're a Tableau Online customer interested in getting an early look at collections, you can sign up to join our Tableau 2020.4 Collections Limited Preview Program. We'll reach out to enable collections for your site so your users can try it out.

Organize your data the way you think about your data

Many customers model their Tableau sites and projects after their organizational structures: by departments, regions, or a nested hierarchy of both. But when it comes to using data, most people collaborate with other teams and departments. Often, this means navigating in and out of different projects to get to the right data and content.

Collections introduce the ability to curate content from across various projects—helping users navigate and find the content they need in the context of how they need to use it. You can promote specific content to help new users find relevant data, align content to workflow processes for easier reviewing or archiving, and even use collections in meetings and quarterly reviews to ensure everyone is referencing the same dashboards.

Collections function as lists—you aren't making copies of your assets, moving them in or out of their project folders, or changing any security permissions. You can keep a collection private just for you, or make it public so others can search for and use it, too.
The same content can be added to different collections, helping keep data conversations centered around a single source of truth.

Getting started with collections

Once collections are enabled for your Tableau Online site, you'll see a new option in the left navigation for collections. Here, you'll find all the collections you have access to—the ones you own and the ones in your site that have been made public. You can also see the collections you own on a tab in My Content.

As long as you're a licensed user on a site, you can create a collection! Click the New Collection button at the top of the collections page to create your collection and edit the default name. As you browse through projects, add content to your collection right from the item's action menu. Or, select multiple assets and use the multi-select action menu to add them all to a collection. You can add many kinds of data assets from your site to a collection, including workbooks, Prep flows, and even data roles. Currently, you can't add a collection to another collection, or add custom views, databases, and tables.

Sharing collections

Just like playlists, you can share collections and help others find the data you've curated for specific projects, meetings, or tasks. By default, collections are made private so that only you (and administrators) can see them. To share a collection, you'll need to change its permissions to public—this allows anyone on the site to browse and find it. You can also choose to send a note to let your colleagues know about the collection.

When you use the Share button to send a link to your collection, the recipients will get an email. Plus, the collection will appear in their Shared with Me channel on the Home page in Tableau Online to make it easy for them to find again. Even though the collection is public, the security permissions for each of the items within it won't change—people will only see the data they have access to.
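The list-not-copy behavior described above can be pictured with a tiny model: a collection holds references to assets, and each asset keeps its own permissions, which are evaluated per viewer. This is a conceptual sketch in Python, not Tableau's implementation; all names and fields are invented for illustration.

```python
# Conceptual model only -- not Tableau's implementation.
class Asset:
    def __init__(self, name, allowed_users):
        self.name = name
        self.allowed_users = set(allowed_users)  # per-asset permissions

class Collection:
    """A collection holds references to assets, never copies."""
    def __init__(self, name):
        self.name = name
        self.items = []  # references to shared Asset objects

    def add(self, asset):
        self.items.append(asset)

    def visible_to(self, user):
        # Permissions stay with each item, not with the collection.
        return [a.name for a in self.items if user in a.allowed_users]

wb = Asset("Sales Dashboard", ["amy", "bo"])
flow = Asset("Nightly Prep Flow", ["amy"])

review = Collection("Quarterly Review")
review.add(wb)
review.add(flow)

print(review.visible_to("bo"))   # ['Sales Dashboard']
print(review.visible_to("amy"))  # ['Sales Dashboard', 'Nightly Prep Flow']
```

Making the collection "public" in this model would only control who can see the list itself; each asset's own permission check still decides what any given viewer can open.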
Try collections for yourself—join our limited preview!

We're excited to release this new feature to all of our customers, but first we'd love to hear from you! If you're a Tableau Online customer and are interested in getting an early look at collections, you can learn more about how to use collections and sign up to join our Tableau 2020.4 Collections Limited Preview Program. We'll reach out to enable collections for your site so your users can try it out.

Your feedback is important to us to better understand how you engage with data so we can build experiences that serve your needs—so thank you! We can't wait to see all the great ways you'll use this new feature.


Seizing the moment: Enabling schools to manage COVID-19 using data-driven analysis from What's New

Anonymous
17 Sep 2020
7 min read
Eillie Anzilotti, Public Affairs Specialist at Tableau | Hannah Kuffner | September 17, 2020

Hard as it is to grapple with the many far-reaching impacts of COVID-19 on our culture, it's even harder to imagine how we would have coped with the same situation even a few decades ago. If the pandemic had hit in 2000 instead of 2020, where would we be? The virtual meetings we conduct for our daily work wouldn't be live on video with document sharing; at best, we'd be on conference calls and sending artifacts by fax. Our safety updates about the virus wouldn't arrive via social media; we'd mostly rely on word of mouth, the daily papers, and the evening news.

Likewise, schools would have been in even graver danger of being left behind by COVID-19 than they are right now. The virus has made traditional day-to-day K-12 education all but impossible for the moment, and schools are working with the latest technology to serve their needs in the best ways possible. So what does that look like? How are schools meeting the need for solutions that address the breadth AND depth of the problem? How are they coordinating resources across disconnected, socioeconomically diverse student populations, and meeting the needs of all involved?

This blog takes a look at two stakeholder groups: students and their families, and teachers and administrators. In each case, we'll see examples of how people are using data to overcome barriers imposed by the pandemic.

Gathering data from students and families

For most school districts, the first step in providing an alternate system of instruction was to assess what students wanted and needed in order to participate. They distributed surveys to households, both on paper and over the phone and web, to find out each student's readiness for online remote learning. Did they have a laptop or other connected device for attending classes? Did they need a Wi-Fi hotspot in order to access the internet?
Once online, could they successfully connect to the district's learning systems?

In addition to technological readiness, surveys were also useful for tracking students' engagement. "We wanted to know how students were feeling about distance learning," said Hope Langston, director of assessment services for the Northfield Public School District in Minnesota. Survey responses indicated the level of difficulty students have adjusting to the changes, helping identify areas that needed the most urgent attention. Northfield continues to offer follow-up surveys that help measure their progress in addressing these issues over time.

Equal Opportunity Schools (EOS), a Seattle-based nonprofit dedicated to improving access for students of color and low-income students, has conducted research throughout the pandemic that assesses remote learning by aggregating various factors of student sentiment, including teacher and principal evaluation, barriers to motivation, and "belonging" as it pertains to their identity, culture, and classroom experience. EOS uses Tableau to relate these factors to one another and gain insights into possible paths for achieving a more comprehensive learning experience for all students.

Equal Opportunity Schools student experience dashboard (EOS)

But school isn't entirely about teaching and learning. Districts have resources that help make sure students are healthy and safe, and deploying those resources during COVID-19 also requires a data-driven strategy. The El Paso Independent School District input survey data into Tableau visualizations and used them for planning nutritional and medical interventions where they were needed, including conducting telehealth sessions between school nurses and students who fell ill. If a household couldn't be reached for survey or classroom participation, truancy officers investigated to check on the wellbeing of students in their homes.
As a district whose majority population is economically disadvantaged, and where one-third of students have limited English proficiency, these interventions were especially important for ensuring effective, equitable outreach.

Student response dashboard (El Paso Independent School District)

Similarly, Northfield used Tableau to visualize survey responses and other data related to their holistic pandemic response, with a heightened focus on achieving equity for underserved areas of the district's community, one-quarter of whom qualify for free and reduced lunch. Using a need-based "heat map" visualization as a daily tracker, Northfield set up food distribution centers in strategic locations and tracked the number of meals delivered per day at each site. Langston and her team also used the data to mobilize local volunteers to help families with language and socioeconomic challenges navigate connectivity.

Meal distribution counts by location (Northfield Healthy Community Initiative)

Throughout these efforts, the availability and visibility of survey responses and other data has been key to coordinating an effective response. "Our dashboards help us meet a need in our community, by getting us the information we need as clearly and as quickly as possible," said Langston. "We couldn't have done this if we didn't have an accurate picture of what the need is."

Empowering teachers and administrators

Teacher readiness—both practical and emotional—is another important factor in achieving effective remote learning operations, and many districts are using a similar survey-based approach to stay connected with educator sentiment. School administrators rely on this information to make sure teachers are getting the resources and support they need, as well as to work with teachers on tracking student engagement and taking action where needed to help their situation improve.
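At their core, the readiness and sentiment surveys described above reduce to tallying categorical answers per question before visualizing them. Here is a minimal sketch in Python using only the standard library; the field names and responses are invented for illustration and are not drawn from any district's actual survey.

```python
from collections import Counter

# Hypothetical survey responses; field names are invented for illustration.
responses = [
    {"has_device": "yes", "needs_hotspot": "no"},
    {"has_device": "no",  "needs_hotspot": "yes"},
    {"has_device": "yes", "needs_hotspot": "yes"},
]

def tally(responses, field):
    """Count the answers given for one survey question."""
    return Counter(r[field] for r in responses)

device_counts = tally(responses, "has_device")
hotspot_counts = tally(responses, "needs_hotspot")

print(device_counts["yes"])   # students ready with a device -> 2
print(hotspot_counts["yes"])  # households needing a hotspot -> 2
```

Per-question tallies like these are exactly the kind of aggregated values a dashboard then plots over time or breaks down by school and population.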
In El Paso, Tableau dashboards track data from the district's remote learning platform and student information system to identify gaps in learning and take immediate action to address them. Viewing data at the district and school levels, and then filtering it down to specific populations or individual students, makes it easy to quickly identify students at risk and report each case to the relevant principal or teacher. The visualizations are used to lead weekly meetings among school principals and assign school-specific tasks.

El Paso Schoology participation rates (El Paso Independent School District)

"Having the dashboards, and being able to quickly export customized reports, meant we could readily engage school administrators," said Steve Clay, Executive Director of Analytics, Strategy, Assessment, and public education information management systems (PEIMS) for El Paso ISD. "As soon as we noticed problematic numbers, we could hand them a list of students and say: Here are the ones who aren't engaging—what's your plan, what haven't you tried yet, and how can we get you some help?"

El Paso also used analytics and reporting to track compliance with new participation-based grading systems, which most teachers had never used before COVID-19. When teachers weren't trained or reminded to use the new system, the grades they reported in the remote learning platform didn't match the new guidelines, potentially confounding the ability to measure student progress. By visualizing the data and seeing the discrepancies, administrators could contact and coach the affected teachers and help bring the invalid grades to a point of fidelity with other measurements in the system.

Preparing for the immediate and long-term future

As schools continue to evolve their policies and practices, both during and after COVID-19, data will play a key role. Districts that lead the way with data-focused innovations are already finding success in adopting new policies that their states set forth.
El Paso's approach to remote learning helped it comply with new standards for student engagement imposed by the Texas Education Agency (TEA), and the district could readily report its progress to TEA using the codes visible in Tableau. The better equipped a district is to tackle problems efficiently and at a granular level, the more capably it can face down unknown challenges in the future.

To see example visualizations from El Paso ISD and other K-12 institutions using Tableau, visit the Tableau Use Cases in Education site.

Announcing Oracle Solaris 11.4: Consistent, secure and easy to use platform

Fatema Patrawala
30 Aug 2018
3 min read
Oracle announced the release of Oracle Solaris 11.4, a trusted business platform. Oracle Solaris offers consistent compatibility and is a secure, simple-to-use platform. Version 11.4 is the first and only operating system with a complete UNIX® V7 certification.

Check out these facts about Oracle Solaris 11.4:

The team worked on 175 development builds to get to Oracle Solaris 11.4
It has been tested for more than 30 million machine hours
50 customers have put Oracle Solaris 11.4 into production
More than 3,000 applications are certified to run on it

New features in Oracle Solaris 11.4

Consistently compatible

A major reason companies and organizations choose Oracle Solaris is its continued consistency. The Oracle Solaris Application Compatibility Guarantee program ensures that applications built for previous releases of Oracle Solaris will continue to work. Additionally, you can migrate Oracle Solaris 10 workloads to Oracle Solaris 11 on modern hardware, with enhanced migration tools and documentation available.

Simple Interface

A new feature, the Observability Tools System Web Interface, brings together several key observability technologies—including the new StatsStore data, audit events, and FMA events—into a centralized, customizable, browser-based interface that lets you see current and past system behavior at a glance. It also allows you to add your own data for collection and customize the interface as you like.

The Service Management Framework has been enhanced to automatically monitor and restart critical applications and services. Oracle Solaris Zones have been updated as well: you can evacuate a system of all of its Zones with just one command. With Oracle Solaris 11.4, you can also define intra-Zone dependencies and have dependent Zones boot in the correct order, enabling you to automatically boot and restart complex application stacks in the correct order.
Safe and Secure

Oracle Solaris 11.4 adds more security capabilities, with multi-node compliance to help you stay secure and compliant. You can set up compliance to push a compliance assessment to all systems with a single command and review the results in a single report. Alternatively, you can set up your systems to regularly generate their compliance reports and push them to a central server, where they can be viewed as a single report.

Oracle Solaris 11.4 also adds trusted path services, so that services of your own, such as Puppet and Chef, can be placed on the trusted path. This allows you to make the requisite changes while keeping the system or zone immutable and protected.

Alongside this update, the team released a new version of Oracle Solaris Cluster, 4.4. To know more about this release and to download Oracle Solaris 11.4, visit the Oracle Technology Network page.

Oracle releases GraphPipe: An open source tool that standardizes machine learning model deployment
Oracle's bid protest against U.S Defence Department's (Pentagon) $10 billion cloud contract
Oracle makes its Blockchain cloud service generally available


Tableau Prep Builder now available in the browser from What's New

Anonymous
02 Dec 2020
4 min read
Rapinder Jawanda | Spencer Czapiewski | December 2, 2020 (updated December 15, 2020)

With the arrival of Tableau 2020.4, we've made exciting advancements for self-service data prep. Now, you can create new Tableau Prep flows as well as edit existing flows directly in the browser. Since all your analytical work can be done conveniently in one spot on the server, web authoring helps analysts eliminate the context switching between creating flows on the desktop and moving them to the server. For IT admins, web authoring simplifies the deployment experience and provides more visibility into the data prep process, enabling better data management.

A simpler, smoother data prep experience for all

Web authoring helps analysts by providing an integrated platform to work completely in the browser. You can create data sources, schedule runs, and use those data sources within your workbooks, all on your server. No more costly context switching between platforms and tools—everything can now be done in one place, from anywhere you have access to a browser.

You can create workbooks and flows directly on the web by selecting from the "New" dropdown on the Start page, Explore page, or Data Source page. We have designed the new browser experience to include autosaving as well. When you create or edit a flow in web authoring, your changes are automatically saved in a draft—no need to explicitly save your flow, and no risk of losing work in progress. You will see your changes being saved in the header.

Since all your Prep work is now on the same server, everything you do in Prep web authoring is automatically compatible with your version of Tableau Server or Online. Everyone in the organization will get the latest version of Prep Builder in their browser when Server or Online is upgraded. Users only need a supported browser on their machine to start creating and editing flows. This means zero installs for users and less work for IT admins.
Update your flows faster

Prep web authoring allows you to update flows faster because you don't have to download the flow, open it in desktop, and then republish the updated flow. Instead, you can click the "Edit" link and open the flow to make a change directly in the browser. Fewer overall steps and no context switching means increased productivity.

Improved data governance

As an IT professional, you will have visibility into all flows that are created or being edited, giving you more control over data and resource usage. Remove unused flows, or preserve resources by preventing multiple users from running the same flow. You can even put Prep web authoring on a separate node as part of your scale-out plan.

Prep web authoring allows your flows to be fully integrated with Tableau Catalog, part of our Data Management offering. You get complete visibility into the flows being created and run, since all of them are now on the server rather than on individual desktops. With Catalog's lineage and impact analysis, you can easily track the data sources being created with flows and see which workbooks are using them.

Get started today

To get started with Tableau Prep on the browser, simply upgrade Tableau Server to version 2020.4, then enable flows in web authoring. For more information, read about these settings and topology changes. Now you're ready to start creating flows in the browser. Just click "New" > "Flow" on your Explore page and you can start building your flow just like in Tableau Prep Builder!

Want to learn more?

Interested in more details about Tableau Prep in the browser? See our Help documentation for Tableau Prep on the Web. Eager to learn more about how to use Tableau Prep? Head over to the Tableau eLearning site and check out the Prep Builder course!