SAP Data Services (BODS): Extraction, Transformation and Loading

Learn SAP BO Data Services hands-on to help you with ETL projects, moving and transforming complex data
Best Selling
4.2 (130 ratings)
806 students enrolled
Created by Junaid Ahmed
Last updated 6/2016
English
Current price: $10 (original price: $200, 95% off)
30-Day Money-Back Guarantee
Includes:
  • 4.5 hours on-demand video
  • 19 articles
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What Will I Learn?
  • Understand Data Services big picture
  • Work with heterogeneous Source and Target Connections
  • Build ETL Jobs using data services transforms and functions
  • Trace, Validate, and Debug Data Services Jobs
  • Work with Global/Local variables, scripts, Custom Functions
  • Implement Change Data Capture in Data Services
  • Implement error handling
Requirements
  • Basic understanding of RDBMS concepts and the SQL language
  • Access to a Data Services server for practice
Description

SAP Data Services is a leader in the data integration and transformation space and a critical component of the SAP HANA solution. With SAP Data Services, you can quickly discover, cleanse, and integrate data – and make it available for real-time analysis. Discover all formats of data in your organization, including unstructured data, to understand the business better, discover profitable patterns, and make better decisions.

Key Learnings:

  • Understand Data Services big picture
  • Work with heterogeneous Source and Target Connections
  • Build ETL Jobs using data services transforms and functions
  • Trace, Validate, and Debug Data Services Jobs
  • Work with Global/Local variables, scripts, Custom Functions
  • Implement Change Data Capture in Data Services
  • Implement error handling

Engaging Teaching:

  1. Expressive and immersive teaching with demonstrations and learning tools
  2. Small, purposeful videos, packed with information, designed specifically for a virtual self-paced audience
  3. Exercises for hands-on practice


Who is the target audience?
  • SAP HANA Consultants
  • ETL Developers
  • Data Consultants
  • Solution Architects
  • Managers
  • Super/Power users
Curriculum For This Course
62 Lectures
04:16:56
Introduction to the course, use case, tool and basics
3 Lectures 13:27

Hello! Welcome.

In this video I will walk you through the course structure and layout, what to expect from each topic, and some tips for the best learning outcomes.

Preview 03:31

In this video we will understand the ETL process and its aspects from a 50,000-foot level and set the stage for the coming lectures.

What is ETL, and the need for it
05:28

We will understand the positioning of the tool, its core functionalities, its connectivity capabilities, and its technical and business benefits.

How Data Services can help us and its core functionalities
04:28
+
Architecture and GUI
2 Lectures 18:08

Here we look into the several unique components Data Services relies on to accomplish the data integration and data quality activities required to manage corporate data.

BODS Architecture
09:55

Data Services Designer is a Windows client application used to create, test, and manually execute jobs that transform data and populate a data warehouse. Using the Designer, you create data management applications that consist of data mappings, transformations, and control logic.

The Designer - your bread and butter
08:13
Connect to DBs, Flat Files and XML data sources
12 Lectures 33:30

You are responsible for extracting data into the company's Business Warehouse system and want to use Data Services as the new data transfer process.

A datastore provides a connection or multiple connections to data sources such as a database. Using the datastore connection, Data Services can import the metadata that describes the data from the data source.

Concept of Datastores in Data Services
02:20

You are working as an ETL developer using SAP Data Services Designer. You will create datastores for the source, target, and staging databases.

Preview 03:04

You are working as an ETL developer using SAP Data Services Designer. You will create datastores for the source, target, and staging databases.

How to connect to SQL Server
01:32

Data Services determines and stores a specific set of metadata information for tables. Import metadata by naming, searching, and browsing. After importing metadata, edit column names, descriptions, and data types.

How to discover metadata and profile data of the datastores
06:00

Exercise 1
00:02

Let's connect to an SAP ERP system from Data Services. For most Enterprise Resource Planning (ERP) applications, Data Services generates SQL that is optimized for the specific target database (for example, Oracle, DB2, SQL Server, and Informix).

Create a datastore to connect to an SAP ERP System
03:16

We will learn to use template tables in early application development, when you are designing and testing a project.

Template Tables
00:52

We will see how to work with flat files, create file formats, and work with different delimiters.

Preview 04:56

Exercise 2
00:02

Here we learn how to read multiple flat files with identical formats through a single file format, by substituting a wildcard character or a list of file names.

Working with multiple flat files
04:05

We will learn how to import Excel workbooks, create formats, filter data, and more.

How to import Excel files
03:03

Data Services can interpret XML as well. Let's see how to create XML file formats and import data.

How to work with XML files
04:17
Concept, Working and Scheduling of an ETL Job
5 Lectures 26:10

Let's understand the Data Services job structure, its different components, and how to sequence them in parallel and in series.

Create an ETL Job in DS, Structure and Sequencing
04:45

Let's see how to build data flows. They contain the source, transform, and target objects that represent the key activities in data integration and data quality processes.

Preview 11:15

Exercise 3
00:02

Troubleshooting is a key skill for developers. It starts with understanding where to find the error logs, traces, and job statistics.

Where are the errors, logs and statistics - Top customer priority
04:56

A top customer concern is being able to schedule a job for automatic execution. Let's see how to do that.

A job is the only executable object in Data Services. When developing data flows, you can
manually execute and test jobs directly in Data Services.

Scheduling a Job - Top Customer Priority
05:12
Platform Transforms - They make a developer's life easy!
10 Lectures 38:58

Here we will see that the Query transform is the most commonly used transform and is included in most data flows. It enables you to select data from a source and filter it or reformat it as it moves to the target.
Using the QUERY Transform (filter, Join and more)
10:33

Exercise 4
00:02

Here we learn to use the Case transform to simplify branch logic in data flows by consolidating case or decision-making logic into one transform.

Using the Case Transform
08:43

Exercise 5
00:02

Here we learn to use the Merge transform to combine incoming data sets with the same schema structure. The merge produces a single output data set with the same schema as the input data sets.

Using the Merge Transform
05:55

Exercise 6
00:02


The Validation transform enables you to create validation rules and move data into target objects based on whether they pass or fail validation.

Using the Validation Transform
10:05

Exercise 7
00:02

Use the SQL transform to submit SQL commands that generate data to be moved into target objects.

Using the SQL Transform
03:32

Exercise 8
00:02
Using Variables, Parameters, and Scripts
4 Lectures 32:50

Local variables are restricted to the job or work flow in which they are created. Use parameters to
pass local variables to the work flows and data flows in the object. A local variable is included as
part of the definition of the work flow or data flow, and so it is portable between jobs.
Global variables are also restricted to the job in which they are created. However, they do not
require parameters to be passed to work flows and data flows in that job. You can reference the
global variable directly in expressions for any object of the job.

Global, Local, Substitution Variables
05:11
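
As a quick illustration, here is a minimal script sketch (the names are hypothetical: $G_LOAD_DATE would be declared as a global variable in the job's Variables and Parameters window, and $$S_FILE_DIR as a substitution parameter):

# Assign a value to a global variable in a script step
$G_LOAD_DATE = sysdate();
print('Load date set to: [$G_LOAD_DATE]');

# Reference a substitution parameter wherever a constant is allowed
print('File directory: [$$S_FILE_DIR]');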

A script is a single-use object that is used to call functions and assign values in a work flow.
Execute a script before data flows for initialization steps and use a script with conditionals to
determine execution paths. You may also use a script after work flows or data flows to record
execution information such as time, or to record a change in the number of rows in a data set.
Use a script to calculate values that are passed on to other parts of the work flow or to assign
values to variables and execute functions.

Creating a Script and demo for Global, Local and Substitution variables
14:14
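
For instance, a wrap-up script placed after a data flow might record execution information like this (a rough sketch: $G_START_TIME is a hypothetical global variable set at job start, and DS_STAGE.DBO.SALES a hypothetical target table):

# Record execution information after a data flow completes
print('Job [job_name()] started at [$G_START_TIME]');
print('Job finished at [sysdate()]');
print('Rows now in target: [total_rows(DS_STAGE.DBO.SALES)]');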

If the built-in functions provided by Data Services do not meet your requirements, you
can create your own custom functions using the Data Services scripting language.
Create your own functions by writing script functions in the Data Services scripting language
using the Smart Editor. Saved custom functions appear under the Custom Functions category in
the Function Wizard and the Smart Editor, and on the Custom Functions tab of the object library.

Creating and Using Custom Functions
13:23
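
As a sketch, the body of a hypothetical custom function CF_FULL_NAME, defined in the Smart Editor with input parameters $FirstName and $LastName and a varchar return type, could be as simple as:

# Concatenate two name parts, trimming trailing blanks from each
Return rtrim_blanks($FirstName) || ' ' || rtrim_blanks($LastName);

It could then be called like any built-in function, for example in a Query transform mapping: CF_FULL_NAME(QRY_CUST.FIRST_NAME, QRY_CUST.LAST_NAME).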

Exercise 9
00:02
+
Using Built-In Functions
7 Lectures 16:02

Retrieve a value in a table or file based on the values in a different source table or file:
• Return multiple columns from a single lookup.
• Choose from additional operators to specify a lookup condition.
• Specify a return policy for your lookup.
• Perform multiple lookups.

Use the lookup_ext() Function
03:40
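
In a Query transform mapping, a lookup_ext() call might look like the sketch below (the datastore, table, and column names are hypothetical):

# Return CUST_NAME from the CUSTOMER lookup table where CUST_ID matches the
# incoming QRY_SALES.CUST_ID; 'none' is the default when no match is found,
# and 'MAX' is the return policy if several rows match
lookup_ext([DS_SRC.DBO.CUSTOMER, 'PRE_LOAD_CACHE', 'MAX'],
           [CUST_NAME],
           ['none'],
           [CUST_ID, '=', QRY_SALES.CUST_ID])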

Exercise 10
00:01

The search_replace function performs a simple search and replace based on a string value, word value, or an entire field.

Use the search_replace Function
02:39

Exercise 11
00:02

# Print functions for object information

print('Host Name: [host_name()]');
print('Repository Name: [repository_name()]');
print('Starting execution of Job: [job_name()] as user: [system_user_name()]');
print('Work Flow Name: [workflow_name()]');

# System information

print('System Date: [sysdate()]');
print('System Time: [systime()]');
print('System User Name: [system_user_name()]');

Functions to get object info (job, work flow, data flow, system)
02:27

# Check for a file and move it if it exists; otherwise wait for it
$FileName_Pre = 'C:\Users\hana1\Desktop\CSV files\CSV.txt';
$FileName = $FileName_Pre;
print($FileName);
$NewFileName = 'C:\Users\hana1\Desktop\CSV files\2\CDVNEW.txt';

if (file_exists($FileName) = 1)
begin
    print('File exists');
    file_move($FileName, $NewFileName, 1);
    print('File has been moved');
    # file_delete($FileName);
    # print('Old file has been deleted');
end
else
begin
    print('**********************************');
    print('*****File does not exist**********');
    print('*****Waiting for the file*********');
    # Wait up to 120 seconds, polling every 60 seconds (values in milliseconds)
    wait_for_file($FileName, 120000, 60000);
end

Functions for file processing, like checking for the existence of a file
04:48

# to_char: format dates as strings

print(to_char(sysdate(), 'MM'));
print(to_char(sysdate(), 'yyyy.mm.dd'));
print(to_char(sysdate(), 'dd.mm.yyyy'));
# Other useful format strings:
# to_char(sysdate(), 'MONTH')
# to_char(sysdate(), 'DD')
# to_char(sysdate(), 'YY')
# to_char(sysdate(), 'YYYY')
# to_char(sysdate(), 'HH')
# to_char(sysdate(), 'MI')
# to_char(sysdate(), 'SS')
# to_char(sysdate(), 'FF')

# to_date: parse a string into a date
print(to_date('jun 18,2012', 'MON DD,YYYY'));

# cast: convert between data types
print(cast('13.54', 'INT'));
print(cast('12.20', 'DECIMAL(3,1)'));

# ltrim: strip the given leading characters
print(ltrim('Marilyn', 'Ma'));      # returns 'rilyn'
print(ltrim('ABCABCD', 'ABC'));     # returns 'D'

# ceil: round up to the nearest integer
print(ceil(12.12345));              # returns 13
print(ceil(-12.223));               # returns -12

String functions
02:24
Managing Slowly Changing Dimensions in BODS
10 Lectures 52:51
Types of slowly changing dimensions
03:42

Setting up a full CDC solution within Data Services may not be required. Many databases now have CDC support built into them, such as Oracle, SQL Server, DB2, and SAP Sybase. Alternatively, you can combine surrogate keys with the Map Operation transform to change all UPDATE row types to INSERT row types to capture changes.

Source CDC vs Target CDC Methods
06:34

Source-based Changed Data Capture (CDC) is the preferred method of updating data because it improves performance by extracting the fewest rows. Source-based CDC, also referred to as incremental extraction, extracts only the changed rows from the source.

Using Source-Based CDC Concept - DEMO
14:26
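
One common source-based pattern (a sketch, using a hypothetical CDC_CONTROL table and hypothetical global variables) is to read the last load timestamp in a script and use it to filter the source in the data flow:

# Script before the data flow: fetch the last successful load time
$G_LAST_LOAD = sql('DS_STAGE', 'SELECT MAX(LOAD_END) FROM CDC_CONTROL');
print('Extracting rows changed since [$G_LAST_LOAD]');

# In the data flow, the Query transform WHERE clause would then filter on
#   SRC_ORDERS.LAST_MODIFIED > $G_LAST_LOAD

# Script after the data flow: record this run for the next delta
$G_THIS_LOAD = sysdate();
sql('DS_STAGE', 'INSERT INTO CDC_CONTROL (LOAD_END) VALUES ({$G_THIS_LOAD})');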

Exercise 12
00:01

Target-based Change Data Capture (CDC) compares the source to the target to determine
which records have changed.

Using Target-Based (CDC) Concept - PART 1 - PREP
08:09

Target-based Change Data Capture (CDC) compares the source to the target to determine
which records have changed.

Using Target-Based (CDC) Concept - PART 2 -DEMO
10:23

The History Preserving transform converts rows flagged as UPDATE to UPDATE plus INSERT, so that the original values are preserved in the target. Specify the column in which to look for updated data.

Using Target based (CDC) - History Preserving Demo
03:36

Exercise 13
00:01

Use the Pivot transform to convert columns into rows and to
convert rows back into columns.

Using the Pivot Transform
05:56

Exercise 14
00:02
Batch Job Troubleshooting
6 Lectures 10:30

An annotation is like a sticky note with a folded-down corner that you can attach to describe a job, work flow, or data flow.

Tracing jobs - Use trace properties to select the information that Data Services monitors and writes to the trace log file during a job. Data Services writes trace messages to the trace log associated with the current Job Server and writes error messages to the error log associated with the current Job Server.

Annotations and Tracing Jobs
03:25

Exercise 15
00:02

Use the Interactive Debugger to examine what happens to data after each transform or object in
a flow and to troubleshoot any issues that arise when executing your jobs.

Debugging Data Flows - using Debugger
02:33

Exercise 16
00:01

Learn to set up audit rules to ensure the correct data is loaded to the target when executing jobs.

Auditing Data Flows
04:27

Exercise 17
00:01
Error Handling
3 Lectures 14:41

Resolve issues if a Data Services job execution is not successful, for example, if a server failure prevents the completion of the job. Use recoverable work flows and try/catch blocks to recover data for sophisticated error handling.

Setting Up Error Handling Implementation Concept
05:54
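
Inside a Catch block, a short script can capture the failure details before any recovery steps run (a sketch; error_number() and error_message() are available only within a Catch block):

# Log the failure details from within a Catch block
print('Job [job_name()] failed.');
print('Error number : [error_number()]');
print('Error message: [error_message()]');
# Optionally abort downstream processing with a custom message:
# raise_exception('Load aborted - see the error log');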

Resolve issues if a Data Services job execution is not successful, for example, if a server failure prevents the completion of the job. Use recoverable work flows and try/catch blocks to recover data for sophisticated error handling.

Setting Up Error Handling Implementation Demo
08:45

Exercise 18
00:02
About the Instructor
Junaid Ahmed
4.2 Average rating
1,154 Reviews
7,870 Students
7 Courses
Entrepreneur, Product Designer, Architect and Trainer

Hello there! I am Junaid Ahmed. I have a bachelor's in electrical and electronics engineering and a master's in software engineering. I would like to call myself a technologist; I am excited to learn new technologies and to work with and teach them.

I come from a background of enterprise software, and I am currently also focusing on the Internet of Things (IoT) and building products. I have worked for over 10 years with enterprise applications, reporting, and security products from SAP and Oracle. I have consulted with large organizations in the government, telecom, power, and FMCG sectors, as well as start-ups. I manage projects using agile methods and tools. I hold certifications in SAP HANA and as a Scrum Master. As far as training goes, I have been involved in on-site and online training for the last 6 years in different products and practices, including SAP HANA, BODS, the BOBJ Suite, soft skills, and IoT. I have over 6,000 students enrolled in my courses on Udemy. My clients include consulting companies, implementation partners, and consultants. Currently, I have also started working with firms to provide solutions in the IoT space.

Happy Learning!