Data Visualization Training Courses

Testimonials

Data Visualization

I thought that the information was interesting.

Allison May - Virginia Department of Education

Data Visualization

I really appreciated that Jeff utilized data and examples that were applicable to education data. He made it interesting and interactive.

Carol Wells Bazzichi - Virginia Department of Education

Data Visualization

Learning about all the chart types and what they are used for. Learning the value of decluttering. Learning about the methods to show time data.

Susan Williams - Virginia Department of Education

Data Visualization

Trainer was enthusiastic.

Diane Lucas - Virginia Department of Education

Data Visualization

Content / Instructor

Craig Roberson - Virginia Department of Education

Data Visualization

I am a hands-on learner and this was something that he did a lot of.

Lisa Comfort - Virginia Department of Education

Data Visualization

The examples.

Peter Coleman - Virginia Department of Education

Data Visualization

Good real world examples, reviews of existing reports

Ronald Parrish - Virginia Department of Education

A practical introduction to Data Analysis and Big Data

Willingness to share more

Balaram Chandra Paul - MOL Information Technology Asia Limited

Data Visualization Course Outlines

Code  Name  Duration  Overview
pythonmultipurpose Advanced Python 28 hours In this instructor-led training, participants will learn advanced Python programming techniques, including how to apply this versatile language to solve problems in areas such as distributed applications, finance, data analysis and visualization, UI programming and maintenance scripting. Audience Developers Format of the course Part lecture, part discussion, exercises and heavy hands-on practice Notes If you wish to add, remove or customize any section or topic within this course, please contact us to arrange.   Introduction     Python versatility: from data analysis to web crawling Python data structures and operations     Integers and floats     Strings and bytes     Tuples and lists     Dictionaries and ordered dictionaries     Sets and frozen sets     Data frame (pandas)     Conversions Object-oriented programming with Python     Inheritance     Polymorphism     Static classes     Static functions     Decorators     Other Data Analysis with pandas     Data cleaning     Using vectorized data in pandas     Data wrangling     Sorting and filtering data     Aggregate operations     Analyzing time series Data visualization     Plotting diagrams with matplotlib     Using matplotlib from within pandas     Creating quality diagrams     Visualizing data in Jupyter notebooks     Other visualization libraries in Python Vectorizing Data in Numpy     Creating Numpy arrays     Common operations on matrices     Using ufuncs     Views and broadcasting on Numpy arrays     Optimizing performance by avoiding loops     Optimizing performance with cProfile Processing Big Data with Python     Building and supporting distributed applications with Python     Data storage: Working with SQL and NoSQL databases     Distributed processing with Hadoop and Spark     Scaling your applications Python for finance     Packages, libraries and APIs for financial processing         Zipline         PyAlgoTrade         Pybacktest         quantlib         Python APIs Extending Python (and vice versa) with other languages     C#     Java     C++     Perl     Others Python multi-threaded programming     Modules     Synchronizing     Prioritizing UI programming with Python     Framework options for building GUIs in Python         Tkinter         Pyqt Python for maintenance scripting     Raising and catching exceptions correctly     Organizing code into modules and packages     Understanding symbol tables and accessing them in code     Picking a testing framework and applying TDD in Python Python for the web     Packages for web processing     Web crawling     Parsing HTML and XML     Filling web forms automatically Closing remarks
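
To make the pandas and matplotlib topics in the outline above concrete, here is a minimal sketch of the kind of exercise such a course typically covers: cleaning a small data set, aggregating it, and plotting the result. The column names and sample values are invented for illustration and are not part of the course material.

```python
# Minimal pandas/matplotlib sketch: clean, aggregate, and plot a small data set.
# Column names and values are invented for illustration only.
import pandas as pd
import matplotlib.pyplot as plt

# Build a small data frame with a missing value to demonstrate cleaning.
df = pd.DataFrame({
    "region": ["North", "South", "North", "West", "South"],
    "sales": [120.0, 95.5, None, 210.0, 130.25],
})

df = df.dropna(subset=["sales"])              # drop rows with missing sales
totals = df.groupby("region")["sales"].sum()  # aggregate per region

# Plot directly from pandas (which delegates to matplotlib).
ax = totals.plot(kind="bar", title="Sales by region")
ax.set_ylabel("sales")
plt.tight_layout()
plt.savefig("sales_by_region.png")            # or plt.show() in an interactive session
```
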
zeppelin Zeppelin for interactive data analytics 14 hours Apache Zeppelin is a web-based notebook for capturing, exploring, visualizing and sharing Hadoop and Spark based data. This instructor-led, live training introduces the concepts behind interactive data analytics and walks participants through the deployment and usage of Zeppelin in a single-user or multi-user environment. By the end of this training, participants will be able to: Install and configure Zeppelin Develop, organize, execute and share data in a browser-based interface Visualize results without referring to the command line or cluster details Execute and collaborate on long workflows Work with any of a number of plug-in language/data-processing-backends, such as Scala ( with Apache Spark ), Python ( with Apache Spark ), Spark SQL, JDBC, Markdown and Shell. Integrate Zeppelin with Spark, Flink and Map Reduce Secure multi-user instances of Zeppelin with Apache Shiro Audience Data engineers Data analysts Data scientists Software developers Format of the course Part lecture, part discussion, exercises and heavy hands-on practice To request a customized course outline for this training, please contact us.  
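
As a rough illustration of the browser-based workflow described above, the sketch below shows what a single Zeppelin paragraph might contain when the PySpark interpreter is bound. The file path and column names are placeholders, and the spark and z objects are assumed to be provided by Zeppelin's runtime (z.show renders a result as a table with built-in chart options).

```python
%pyspark
# Hypothetical Zeppelin paragraph: load a CSV with Spark and visualize it in the notebook.
# "spark" and "z" are injected by Zeppelin; the path and column name are placeholders.
df = spark.read.csv("/data/events.csv", header=True, inferSchema=True)

daily = df.groupBy("event_date").count()

# Render as a Zeppelin table; charts (bar, line, pie, ...) can be chosen in the UI.
z.show(daily)
```
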
datavis1 Data Visualization 28 hours This course is intended for engineers and decision makers working in data mining and knowledge discovery. You will learn how to create effective plots and ways to present and represent your data so that they appeal to decision makers and help them understand hidden information. Day 1: what is data visualization; why it is important; data visualization vs data mining; human cognition; HMI; common pitfalls. Day 2: different types of curves; drill-down curves; categorical data plotting; multi-variable plots; data glyph and icon representation. Day 3: plotting KPIs with data; R and X charts examples; what-if dashboards; parallel axes; mixing categorical data with numeric data. Day 4: different hats of data visualization; how data visualization can lie; disguised and hidden trends; a case study of student data; visual queries and region selection.
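
The outline above is tool-agnostic; purely as an illustration of the "R and X charts" topic (control charts that track a KPI against control limits), here is a short Python/matplotlib sketch. The sample values are invented, and the simple 3-sigma limits are a simplification of the usual control-chart constants.

```python
# Illustrative X chart (individuals control chart) for a KPI, using invented sample data.
import matplotlib.pyplot as plt

kpi = [9.8, 10.1, 10.0, 9.7, 10.4, 10.2, 9.9, 10.6, 10.0, 9.8]   # invented measurements
mean = sum(kpi) / len(kpi)

# Simplified +/- 3-sigma control limits around the mean (illustration only).
sigma = (sum((x - mean) ** 2 for x in kpi) / len(kpi)) ** 0.5
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma

plt.plot(kpi, marker="o", label="KPI")
plt.axhline(mean, color="green", linestyle="--", label="mean")
plt.axhline(ucl, color="red", linestyle=":", label="UCL")
plt.axhline(lcl, color="red", linestyle=":", label="LCL")
plt.legend()
plt.title("KPI control chart (X chart)")
plt.savefig("kpi_control_chart.png")
```
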
datavisR1 Introduction to Data Visualization with R 28 hours This course is intended for data engineers, decision makers and data analysts. It will lead you to create very effective plots, using RStudio, that appeal to decision makers and help them discover hidden information and make the right decisions. Day 1: overview of R programming; introduction to data visualization; scatter plots and clusters; the use of noise and jitter. Day 2: other types of 2D and 3D plots; histograms; heat charts; categorical data plotting. Day 3: plotting KPIs with data; R and X charts examples; dashboards; parallel axes; mixing categorical data with numeric data. Day 4: different hats of data visualization; disguised and hidden trends; case studies; saving plots and loading Excel files.
surveyp Research Survey Processing 28 hours This four-day course walks you from the point where you design your research surveys to the time when you collect the findings of the survey. The course is based on Excel and Matlab. You will learn how to design the survey form, what the suitable data fields should be, and how to process extra data information when needed. The course will show you how data is entered and how to validate and correct wrong data values. At the end, data analysis will be conducted in a variety of ways to ensure the effectiveness of the data gathered and to find hidden trends and knowledge within this data. A number of case studies will be carried out during the course to make sure all the concepts have been well understood. Day 1: Data analysis; determining the target of the survey; survey design; data fields and their types; dealing with drill-down surveys; data collection; data entry; Excel session. Day 2: Data cleaning; data reduction; data sampling; removing unexpected data; removing outliers; data analysis; statistics is not enough; Excel session. Day 3: Data visualization; parallel coordinates; scatter plots; pivot tables; cross tables; Excel session; conducting data mining algorithms on the data (decision trees, clustering, mining association rules); Matlab session. Day 4: Reporting and disseminating results; archiving the data and the findings; feedback for conducting new surveys.
kdd Knowledge Discovery in Databases (KDD) 21 hours Knowledge discovery in databases (KDD) is the process of discovering useful knowledge from a collection of data. Real-life applications for this data mining technique include marketing, fraud detection, telecommunication and manufacturing. In this course, we introduce the processes involved in KDD and carry out a series of exercises to practice the implementation of those processes. Audience: Data analysts or anyone interested in learning how to interpret data to solve problems. Format of the course: After a theoretical discussion of KDD, the instructor will present real-life cases which call for the application of KDD to solve a problem. Participants will prepare, select and cleanse sample data sets and use their prior knowledge about the data to propose solutions based on the results of their observations. Introduction (KDD vs data mining); Establishing the application domain; Establishing relevant prior knowledge; Understanding the goal of the investigation; Creating a target data set; Data cleaning and preprocessing; Data reduction and projection; Choosing the data mining task; Choosing the data mining algorithms; Interpreting the mined patterns.
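
To ground the KDD steps listed above, here is a compact sketch of one possible pass through the process in Python with pandas and scikit-learn: create a target data set, clean and preprocess it, reduce its dimensionality, run a mining algorithm, and interpret the result. The feature names, toy values and the choice of k-means are assumptions made purely for illustration.

```python
# Illustrative KDD pass: selection -> cleaning/preprocessing -> reduction -> mining -> interpretation.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# 1. Target data set (a toy frame standing in for a database extract).
data = pd.DataFrame({
    "purchases": [5, 3, 40, 2, 38, 41, 4],
    "returns":   [0, 1, 5, 0, 6, 4, 1],
    "tenure":    [12, 8, 60, 6, 55, 58, 10],
})

# 2. Cleaning and preprocessing: drop missing rows, standardize scales.
X = StandardScaler().fit_transform(data.dropna())

# 3. Data reduction and projection.
X2 = PCA(n_components=2).fit_transform(X)

# 4. Choose and run the mining algorithm (here, k-means clustering).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X2)

# 5. Interpret the mined patterns: inspect cluster membership per record.
print(data.assign(cluster=labels))
```
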
OpenNN OpenNN: Implementing neural networks 14 hours OpenNN is an open-source class library written in C++  which implements neural networks, for use in machine learning. In this course we go over the principles of neural networks and use OpenNN to implement a sample application. Audience     Software developers and programmers wishing to create Deep Learning applications. Format of the course     Lecture and discussion coupled with hands-on exercises. Introduction to OpenNN, Machine Learning and Deep Learning Downloading OpenNN Working with Neural Designer     Using Neural Designer for descriptive, diagnostic, predictive and prescriptive analytics OpenNN architecture     CPU parallelization OpenNN classes     Data set, neural network, loss index, training strategy, model selection, testing analysis     Vector and matrix templates Building a neural network application     Choosing a suitable neural network     Formulating the variational problem (loss index)     Solving the reduced function optimization problem (training strategy) Working with datasets      The data matrix (columns as variables and rows as instances) Learning tasks     Function regression     Pattern recognition Compiling with QT Creator Integrating, testing and debugging your application The future of neural networks and OpenNN
druid Druid: Build a fast, real-time data analysis system 21 hours Druid is an open-source, column-oriented, distributed data store written in Java. It was designed to quickly ingest massive quantities of event data and execute low-latency OLAP queries on that data. Druid is commonly used in business intelligence applications to analyze high volumes of real-time and historical data. It is also well suited for powering fast, interactive, analytic dashboards for end users. Druid is used by companies such as Alibaba, Airbnb, Cisco, eBay, Netflix, PayPal, and Yahoo. In this course we explore some of the limitations of data warehouse solutions and discuss how Druid can complement those technologies to form a flexible and scalable streaming analytics stack. We walk through many examples, offering participants the chance to implement and test Druid-based solutions in a lab environment. Audience: application developers, software engineers, technical consultants, DevOps professionals, architecture engineers. Format of the course: part lecture, part discussion, heavy hands-on practice, occasional tests to gauge understanding. Introduction; Installing and starting Druid; Druid architecture and design; Real-time ingestion of event data; Sharding and indexing; Loading data; Querying data; Visualizing data; Running a distributed cluster; Druid + Apache Hive; Druid + Apache Kafka; Druid + others; Troubleshooting; Administrative tasks.
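
As an illustration of how an application might query Druid, the sketch below posts a native JSON timeseries query to a Druid broker from Python. The broker address, datasource name, interval and aggregation are placeholders chosen for the example; adjust them to your own deployment.

```python
# Hypothetical client-side sketch: send a native timeseries query to a Druid broker.
import requests

query = {
    "queryType": "timeseries",
    "dataSource": "events",                  # placeholder datasource name
    "granularity": "day",
    "intervals": ["2023-01-01/2023-01-08"],  # placeholder interval
    "aggregations": [{"type": "count", "name": "rows"}],
}

# 8082 is the broker's default HTTP port in many setups; verify for your cluster.
resp = requests.post("http://localhost:8082/druid/v2/", json=query, timeout=30)
resp.raise_for_status()

for bucket in resp.json():
    print(bucket["timestamp"], bucket["result"]["rows"])
```
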
octnp Octave not only for programmers 21 hours This course is dedicated to those who would like to learn an alternative to the commercial MATLAB package. The three-day training provides comprehensive information on moving around the environment and using the OCTAVE package for data analysis and engineering calculations. The training is aimed at beginners, but also at those who already know the program and would like to systematize their knowledge and improve their skills. Knowledge of other programming languages is not required, but it will greatly facilitate the learners' acquisition of knowledge. The course will show you how to use the program in many practical examples. Introduction; Simple calculations: starting Octave, Octave as a calculator, built-in functions; The Octave environment: named variables, numbers and formatting, number representation and accuracy, loading and saving data; Arrays and vectors: extracting elements from a vector, vector maths; Plotting graphs: improving the presentation, multiple graphs and figures, saving and printing figures; Octave programming I (script files): creating and editing a script, running and debugging scripts; Control statements: if else, switch, for, while; Octave programming II (functions); Matrices and vectors: matrix, the transpose operator, matrix creation functions, building composite matrices, matrices as tables, extracting bits of matrices, basic matrix functions; Linear and nonlinear equations; More graphs: putting several graphs in one window, 3D plots, changing the viewpoint, plotting surfaces, images and movies; Eigenvectors and the singular value decomposition; Complex numbers: plotting complex numbers; Statistics and data processing; GUI development.
nlpwithr NLP: Natural Language Processing with R 21 hours It is estimated that unstructured data accounts for more than 90 percent of all data, much of it in the form of text. Blog posts, tweets, social media, and other digital publications continuously add to this growing body of data. This course centers around extracting insights and meaning from this data. Utilizing the R Language and Natural Language Processing (NLP) libraries, we combine concepts and techniques from computer science, artificial intelligence, and computational linguistics to algorithmically understand the meaning behind text data. Data samples are available in various languages per customer requirements. By the end of this training participants will be able to prepare data sets (large and small) from disparate sources, then apply the right algorithms to analyze and report on its significance. Audience     Linguists and programmers Format of the course     Part lecture, part discussion, heavy hands-on practice, occasional tests to gauge understanding Introduction     NLP and R vs Python Installing and configuring R Studio Installing R packages related to Natural Language Processing (NLP). An overview of R’s text manipulation capabilities Getting started with an NLP project in R Reading and importing data files into R Text manipulation with R Document clustering in R Parts of speech tagging in R Sentence parsing in R Working with regular expressions in R Named-entity recognition in R Topic modeling in R Text classification in R Working with very large data sets Visualizing your results Optimization Integrating R with other languages (Java, Python, etc.) Closing remarks
BigData_ A practical introduction to Data Analysis and Big Data 28 hours Participants who complete this training will gain a practical, real-world understanding of Big Data and its related technologies, methodologies and tools. Participants will have the opportunity to put this knowledge into practice through hands-on exercises. Group interaction and instructor feedback make up an important component of the class. The course starts with an introduction to elemental concepts of Big Data, then progresses into the programming languages and methodologies used to perform Data Analysis. Finally, we discuss the tools and infrastructure that enable Big Data storage, Distributed Processing, and Scalability. Audience: Developers / programmers, IT consultants. Format of the course: Part lecture, part discussion, heavy hands-on practice and implementation, occasional quizzing to measure progress. Introduction to Data Analysis and Big Data What makes Big Data "big"? Velocity, Volume, Variety, Veracity (VVVV) Limits to traditional Data Processing Distributed Processing Statistical Analysis Types of Machine Learning Analysis Data Visualization Languages used for Data Analysis R language (crash course) Why R for Data Analysis? Data manipulation, calculation and graphical display Python (crash course) Why Python for Data Analysis? Manipulating, processing, cleaning, and crunching data Approaches to Data Analysis Statistical Analysis Time Series analysis Forecasting with Correlation and Regression models Inferential Statistics (estimating) Descriptive Statistics in Big Data sets (e.g. calculating mean) Machine Learning Supervised vs unsupervised learning Classification and clustering Estimating cost of specific methods Filtering Natural Language Processing Processing text Understanding meaning of the text Automatic text generation Sentiment/Topic Analysis Computer Vision Acquiring, processing, analyzing, and understanding images Reconstructing, interpreting and understanding 3D scenes Using image data to make decisions Big Data infrastructure Data Storage Relational databases (SQL) MySQL Postgres Oracle Non-relational databases (NoSQL) Cassandra MongoDB Neo4j Understanding the nuances Hierarchical databases Object-oriented databases Document-oriented databases Graph-oriented databases Other Distributed Processing Hadoop HDFS as a distributed filesystem MapReduce for distributed processing Spark All-in-one in-memory cluster computing framework for large-scale data processing Structured streaming Spark SQL Machine Learning libraries: MLlib Graph processing with GraphX Search Engines Elasticsearch Solr Scalability Public cloud AWS, Google, Aliyun, etc. Private cloud OpenStack, Cloud Foundry, etc. Auto-scalability Choosing the right solution for the problem The future of Big Data Closing remarks
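
As a small, self-contained taste of the Spark material in the outline above, the sketch below computes one of the descriptive statistics mentioned (a mean) with both the PySpark DataFrame API and Spark SQL. The file path and column name are placeholders.

```python
# Illustrative PySpark sketch: a descriptive statistic (mean) computed two ways.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("descriptive-stats").getOrCreate()

# Placeholder path; header and schema inference are used for the example.
df = spark.read.csv("/data/measurements.csv", header=True, inferSchema=True)

# DataFrame API:
df.agg(F.avg("value").alias("mean_value")).show()

# Spark SQL over the same data:
df.createOrReplaceTempView("measurements")
spark.sql("SELECT AVG(value) AS mean_value FROM measurements").show()

spark.stop()
```
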
neo4j Beyond the relational database: neo4j 21 hours Relational, table-based databases such as Oracle and MySQL have long been the standard for organizing and storing data. However, the growing size and fluidity of data have made it difficult for these traditional systems to efficiently execute highly complex queries on the data. Imagine replacing rows-and-columns-based data storage with object-based data storage, whereby entities (e.g., a person) could be stored as data nodes, then easily queried on the basis of their vast, multi-linear relationships with other nodes. And imagine querying these connections and their associated objects and properties using a compact syntax, up to 20 times lighter than SQL. This is what graph databases such as neo4j offer. In this hands-on course, we will set up a live project and put into practice the skills to model, manage and access your data. We contrast and compare graph databases with SQL-based databases as well as other NoSQL databases and clarify when and where it makes sense to implement each within your infrastructure. Audience: Database administrators (DBAs), data analysts, developers, system administrators, DevOps engineers, business analysts, CTOs, CIOs. Format of the course: Heavy emphasis on hands-on practice. Most of the concepts are learned through samples, exercises and hands-on development. Getting started with neo4j neo4j vs relational databases neo4j vs other NoSQL databases Using neo4j to solve real-world problems Installing neo4j Data modeling with neo4j Mapping white-board diagrams and mind maps to neo4j Working with nodes Creating, changing and deleting nodes Defining node properties Node relationships Creating and deleting relationships Bi-directional relationships Querying your data with Cypher Querying your data based on relationships MATCH, RETURN, WHERE, REMOVE, MERGE, etc. Setting indexes and constraints Working with the REST API REST operations on nodes REST operations on relationships REST operations on indexes and constraints Accessing the core API for application development Working with .NET, Java, JavaScript, and Python APIs Closing remarks
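
To give a feel for the Cypher and node/relationship operations listed above, here is a minimal sketch using the official neo4j Python driver. The connection URI, credentials and example graph are placeholders, not values from the course.

```python
# Minimal Cypher sketch via the neo4j Python driver; URI, credentials and data are placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

with driver.session() as session:
    # Create (or reuse) two Person nodes and a relationship between them.
    session.run(
        "MERGE (a:Person {name: $a}) "
        "MERGE (b:Person {name: $b}) "
        "MERGE (a)-[:KNOWS]->(b)",
        a="Alice", b="Bob",
    )

    # Query the graph by relationship.
    result = session.run(
        "MATCH (a:Person)-[:KNOWS]->(b:Person) RETURN a.name AS a, b.name AS b"
    )
    for record in result:
        print(record["a"], "knows", record["b"])

driver.close()
```
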
scilab Scilab 14 hours Scilab is a well-developed, free, and open-source high-level language for scientific data manipulation. Used for statistics, graphics and animation, simulation, signal processing, physics, optimization, and more, its central data structure is the matrix, simplifying many types of problems compared to alternatives such as FORTRAN and C derivatives. It is compatible with languages such as C, Java, and Python, making it suitable for use as a supplement to existing systems. In this instructor-led training, participants will learn the advantages of Scilab compared to alternatives like Matlab, the basics of the Scilab syntax as well as some advanced functions, and how to interface with other widely used languages, depending on demand. The course will conclude with a brief project focusing on image processing. By the end of this training, participants will have a grasp of the basic functions and some advanced functions of Scilab, and will have the resources to continue expanding their knowledge. Audience: Data scientists and engineers, especially those with an interest in image processing and facial recognition. Format of the course: Part lecture, part discussion, exercises and intensive hands-on practice, with a final project. Introduction (comparison with other languages); Getting started; Matrix operations; Multidimensional data; Plotting and exporting graphics; Creating an ATOMS toolbox; Interface with C, Java, and others; Final project: image analysis; Overview of useful libraries and extensions; Closing remarks.
matlabdsandreporting MATLAB Fundamentals, Data Science & Report Generation 126 hours In the first part of this training, we cover the fundamentals of MATLAB and its function as both a language and a platform. Included in this discussion is an introduction to MATLAB syntax, arrays and matrices, data visualization, script development, and object-oriented principles. In the second part, we demonstrate how to use MATLAB for data mining, machine learning and predictive analytics. To provide participants with a clear and practical perspective of MATLAB's approach and power, we draw comparisons between using MATLAB and using other tools such as spreadsheets, C, C++, and Visual Basic. In the third part of the training, participants learn how to streamline their work by automating their data processing and report generation. Throughout the course, participants will put into practice the ideas learned through hands-on exercises in a lab environment. By the end of the training, participants will have a thorough grasp of MATLAB's capabilities and will be able to employ it for solving real-world data science problems as well as for streamlining their work through automation. Assessments will be conducted throughout the course to gauge progress. Format of the course: The course includes theoretical and practical exercises, including case discussions, sample code inspection, and hands-on implementation. Note: Practice sessions will be based on pre-arranged sample data report templates. If you have specific requirements, please contact us to arrange. Introduction MATLAB for data science and reporting   Part 01: MATLAB fundamentals Overview     MATLAB for data analysis, visualization, modeling, and programming. Working with the MATLAB user interface Overview of MATLAB syntax Entering commands     Using the command line interface Creating variables     Numeric vs character data Analyzing vectors and matrices     Creating and manipulating     Performing calculations Visualizing vector and matrix data Working with data files     Importing data from Excel spreadsheets Working with data types     Working with table data Automating commands with scripts     Creating and running scripts     Organizing and publishing your scripts Writing programs with branching and loops     User interaction and flow control Writing functions     Creating and calling functions     Debugging with MATLAB Editor Applying object-oriented programming principles to your programs   Part 02: MATLAB for data science Overview     MATLAB for data mining, machine learning and predictive analytics Accessing data     Obtaining data from files, spreadsheets, and databases     Obtaining data from test equipment and hardware     Obtaining data from software and the Web Exploring data     Identifying trends, testing hypotheses, and estimating uncertainty Creating customized algorithms Creating visualizations Creating models Publishing customized reports Sharing analysis tools     As MATLAB code     As standalone desktop or Web applications Using the Statistics and Machine Learning Toolbox Using the Neural Network Toolbox   Part 03: Report generation Overview     Presenting results from MATLAB programs, applications, and sample data     Generating Microsoft Word, PowerPoint®, PDF, and HTML reports.
Templated reports     Tailor-made reports         Using organization’s templates and standards Creating reports interactively vs programmatically     Using the Report Explorer     Using the DOM (Document Object Model) API Creating reports interactively using Report Explorer     Report Explorer Examples         Magic Squares Report Explorer Example     Creating reports         Using Report Explorer to create report setup file, define report structure and content     Formatting reports         Specifying default report style and format for Report Explorer reports     Generating reports         Configuring Report Explorer for processing and running report     Managing report conversion templates         Copying and managing Microsoft Word , PDF, and HTML conversion templates for Report Explorer reports     Customizing Report Conversion templates         Customizing the style and format of Microsoft Word and HTML conversion templates for Report Explorer reports     Customizing components and style sheets         Customizing report components, define layout style sheets Creating reports programmatically in MATLAB     Template-Based Report Object (DOM) API Examples         Functional report         Object-oriented report         Programmatic report formatting     Creating report content         Using the Document Object Model (DOM) API     Report format basics         Specifying format for report content     Creating form-based reports         Using the DOM API to fill in the blanks in a report form     Creating object-oriented reports         Deriving classes to simplify report creation and maintenance     Creating and formatting report objects         Lists, tables, and images     Creating DOM Reports from HTML         Appending HTML string or file to a Microsoft® Word, PDF, or HTML report generated by Document Object Model (DOM) API     Creating report templates         Creating templates to use with programmatic reports     Formatting page layouts         Formatting pages in Microsoft Word and PDF reports Summary and closing remarks

