
SAP BODS Online Training

SAP BODS Online Training Course Details

SAP BusinessObjects Data Services (BODS) is an ETL tool used to build data marts, operational data stores (ODS) and similar targets. It helps ensure complete and accurate information by combining data integration and data quality in a single enterprise-class solution. User-defined standards are published in a custom Cleansing Package, which Data Services consumes through the Data Cleanse transform. SAP BODS Certification Online Training teaches you to move data from any application into an SAP Business Suite deployment and to master data management.

SAP BODS Certification Training Overview

Spiritsofts' main objective is to provide SAP BODS 4.x training that strengthens skills in administration, performance tuning and metadata management. Sessions are designed module by module around each learner's requirements and are delivered hands-on by IT professionals with 11+ years of real-time experience.

SAP BODS Online Training Course Content

Introduction to BODS
  • Place of BODS among modern DI software packages
  • Competitive advantage of BODS
  • ETL and ELT approach
  • How both approaches can be combined in one tool
  • Data Services components and architecture
Basics of BODS
  • Concepts of Job, Workflow and Data Flow
  • Concepts of Datastore
  • Creating basic Batch Job
  • Transforms overview
Data profiling and Query Transform
  • Analyzing data sources
  • Joining data from multiple sources
  • Basic transformations and built-in functions
  • Aggregating data in BODS
  • Lookup (illustrated in the sketch after this list)
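
To make these Query transform topics concrete, here is a minimal sketch of the kind of expressions entered in the transform's mapping, WHERE and GROUP BY tabs. All table and column names below are invented for illustration; the expression language is the same one used in BODS scripts.

    # Mapping expression for a derived output column:
    upper(CUSTOMER.FIRST_NAME) || ' ' || upper(CUSTOMER.LAST_NAME)

    # WHERE tab - join condition between the two input sources:
    CUSTOMER.REGION_ID = REGION.REGION_ID

    # With the grouping columns listed on the GROUP BY tab, an
    # aggregated output column is mapped as, e.g.:
    sum(ORDERS.AMOUNT)
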
Working with Files
  • Defining file formats for file data source
  • Working with XML files (XSD, DTD)
  • COBOL copybooks
  • File Formats as Source and as Targets
Platform Transforms Overview
  • Case and Merge transforms
  • SQL transform
  • Validation transform
  • Row Generation
  • Map Operation
  • Combining Validation and Auditing – best practices
Data Integrator Transforms
  • Working with hierarchical data
  • Hierarchy Flattening transform
  • Merging data
  • Table Comparison transform
  • Other ways of merging data for RDBMS datastores
  • Slowly changing dimension support in BODS
  • History preserving transform
  • Using built-in functions (examples in the sketch after this list)
  • Use date and time functions and the date generation transform to build a dimension table
  • Use the lookup functions to look up status in a table
  • Use match pattern functions to compare input strings to patterns
  • Use database type functions to return information on data sources
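
A hedged script sketch of the function families listed above; exact format masks vary by release, and all values shown are invented.

    # Date and time functions, e.g. stamping a load run:
    $G_LoadDate = to_char(sysdate(), 'YYYY.MM.DD');
    print('Load date: [$G_LoadDate]');

    # match_pattern: X = upper-case letter, 9 = digit.
    # Returns 1 when 'AB-1234' matches the pattern, otherwise 0:
    $L_IsValidCode = match_pattern('AB-1234', 'XX-9999');

    # lookup and lookup_ext are normally configured through their
    # function wizards inside a Query mapping, so they are not shown here.
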
Metadata Management
  • Local and Central repositories
  • Working in a group
  • Propagating jobs from development to staging and production environments – best practices
  • Backing up metadata
  • ATL files – export/import operations
  • Validating, tracing, and debugging jobs
  • Use descriptions and annotations
  • Validate and trace jobs
  • Use view data and the Interactive Debugger
Management Console
  • Administrating BODS
  • Overview of all major tools
  • Executing a job from the server
  • Scheduling a job
  • Auto Documentation reports
  • Designing ETLs with auto documentation in mind
BODS Scripting Language
  • Variables in BODS – scope of variables
  • Scripting language – Advantages and limitations
  • Custom functions in BODS
  • Creating a simple function (see the script sketch after this list)
  • Database procedures
  • Custom functions vs. database functions
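
As a first taste of the scripting language, here is a minimal job-initialisation script. It assumes only standard behaviour: $-prefixed variables, semicolon-terminated statements, [ ] substitution inside string literals, and built-ins such as print(), sysdate(), job_name() and raise_exception(). All variable names and values are invented.

    # Initialise globals at the start of a batch job
    $G_StartTime = sysdate();
    $G_Region    = 'NORTH';

    if ($G_Region = 'NORTH')
    begin
       print('Job [job_name()] loading region [$G_Region]');
    end
    else
    begin
       # Fail fast on an unexpected configuration
       raise_exception('Unsupported region: [$G_Region]');
    end
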
SAP BW 7.5, SAP ECC and SAP CRM Integration with BODS
  • Connecting to SAP Applications
  • ABAP data flow processing using Data Services
  • IDoc interface connectivity using Data Services
  • SAP application table sources in real-time jobs using Data Services
  • Executing batch jobs that contain ABAP data flows
  • Connecting to SAP NetWeaver Business Warehouse
  • Using Data Services in SAP NW BW environments; Open Hub tables
  • Defining SAP NetWeaver BW datastores using Data Services
  • Loading into SAP NW BW using Data Services
SAP HANA 1.0 SPS8 Integration with BODS
  • BODS 4.2 with SAP HANA to build agile data marts using different transforms and file systems
Error Handling in BODS
  • Validating Data Flow, Workflow, Job
  • Error handling on Workflow level – Try and Catch objects
  • Analyzing execution of Batch Job – Log files
  • Debugging in BODS
  • Handling execution errors – best practices (see the catch-script sketch after this list)
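
For example, a script placed inside a Catch object can log the failure before the job stops. This sketch assumes the catch-scope functions error_number() and error_message() available in BODS 4.x:

    # Inside a Catch object: record what went wrong
    print('Job [job_name()] failed: [error_message()] (code [error_number()])');

    # Optionally persist the error to an audit table via sql(), then
    # re-raise so the failure stays visible to the scheduler:
    raise_exception('Stopping after logging the error.');
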
Performance optimization in BODS - Part 1
  • Push-down concept (see the sketch after this list)
  • Cases when push-down doesn't work
  • Data Transfer transform
  • Caching in BODS
  • Defining caching for lookups and Table Comparison
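
When a full push-down succeeds, Data Services collapses source, transforms and target into a single INSERT ... SELECT statement that the database runs itself; the generated SQL can be inspected from the Designer. By contrast, the sql() script function always executes on the database, which makes it useful for quick verification. A sketch with an invented datastore and table:

    # Count rows directly on the database, bypassing the DS engine:
    $L_RowCount = sql('DS_STAGE', 'SELECT COUNT(*) FROM STG_ORDERS');
    print('STG_ORDERS currently holds [$L_RowCount] rows');
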
Performance optimization in BODS - Part 2
  • Degree of parallelism
  • Designing custom function for parallel execution
  • Bulk loading operation
  • Using MultiLoad for Teradata targets
  • Execution parameters
  • Rows per commit
  • Number of loaders
Using BODS Metadata
  • Access to BODS metadata
  • Operational metadata
  • Auditing metadata
  • Creating your own metadata by using BODS metadata
Possible Alternative Topics
  • Data Quality in BODS
  • SAP and SAP BODS / BODI
  • Real-time job
  • Integration with other applications
  • Automatic job creation from template

SAP BODI/BODS Online Training

SAP BODI/BODS Online Training Course Content

Introduction to Data Warehouses, Data Marts and ETL Process 1 Hour and 30 Mins
  • Operational systems and their characteristics
  • Analytical systems (or DWHs) and their characteristics
  • Data Marts
  • Comparison of (a) Operational systems and Analytical systems and (b) Data Warehouses and Data Marts
  • Basic Data Warehouse Architecture – various layers
  • Facts and Dimensions – An example Star schema
  • Slowly Changing Dimensions – Type 1, Type 2 and Type 3 with examples
  • What is ETL – Extraction, Transformation and Load
  • Extraction types – Full and Incremental
  • Load types – Full and Incremental
Introduction to BO Data Services 45 Mins
  • Standard Data Services Architecture – Designer, Repository, Job Server, Engine, Access Server, Administrator, Web Server and Service
  • Basic Data Services Components – Designer, Server Manager, Repository Manager and Management Console
  • Data Services Objects and Object Hierarchy
Preliminary workouts with Data Services 30 Mins
  • Create a local repository using Repository Manager
  • Configure a job server using Server Manager
  • Associate repository to the Job server
  • Connect repository to the Administrator
Working with Designer
  • Logging into the Designer 15 Mins
  • Walk through of Designer User Interface 15 Mins
  • Create and execute a simple Batch Job 3 Hours
    • Create DataBase DataStore and import metadata
    • Create and edit File formats (.txt, .csv, .xls), File format features
    • Create Data Flow, Work Flow, Job and Project
    • Execute the job and monitor its progress
  • Built-in Transforms – All important DI, DQ and Platform Transforms 8 Hours
  • DS Scripting Language and working with Functions 4 Hours
    • Script and the scripting language
    • Built-in functions (important functions such as exec, ifthenelse, decode, is_valid_date, lookup, lookup_ext, lookup_seq, nvl and sql; examples in the sketch after this list)
    • Custom functions, database functions and stored procedures
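
A few hedged one-liners for the conditional and validation functions named above, as they might appear in Query mapping expressions (all column names invented):

    nvl(CUST.MIDDLE_NAME, '')                      # replace NULL with ''
    ifthenelse(ORD.QTY >= 0, 'SALE', 'RETURN')     # two-way branch
    decode(ORD.STATUS = 'O', 'Open',
           ORD.STATUS = 'C', 'Closed',
           'Unknown')                              # first true condition wins
    is_valid_date(SRC.DOC_DATE, 'YYYY.MM.DD')      # 1 if parseable, else 0
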
  • Other important Objects 2 Hours
    • Conditional
    • While loop
    • Try
    • Catch
    • Annotation
  • Working with variables and parameters 1 Hour
      • Global variables
      • Local variables
      • Parameters
  • Working with nested data – extracting data from XML sources 1 Hour
  • Debug features 1 Hour
    • using “View Where Used”
    • using “View Data”
    • using the interactive debugger
  • Working with Full sources, Time stamped sources and CDC sources 4 Hours
    • Implementing type 1 SCD – Full extraction
    • Implementing type 2 SCD – Full extraction
    • Implementing type 1 SCD – Incremental extraction (Time stamped source)
    • Implementing type 2 SCD – Incremental extraction (Time stamped source)
    • Implementing type 1 SCD – Incremental extraction (CDC Source)
    • Implementing type 2 SCD – Incremental extraction (CDC source)
    • Benefits of incremental extraction over full extraction (see the sketch after this list)
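
The timestamp-based variants typically bracket the data flow with scripts like this sketch, which assumes a hypothetical JOB_CONTROL table holding the high-water mark of each load (Oracle-style SQL):

    # Before the data flow: fetch the last successful load time
    $G_LastLoad = sql('DS_TGT', 'SELECT MAX(LOAD_TS) FROM JOB_CONTROL');

    # The extraction Query then filters in its WHERE tab, e.g.:
    #    SRC.CHANGED_ON > $G_LastLoad

    # After the data flow: advance the high-water mark
    sql('DS_TGT', 'UPDATE JOB_CONTROL SET LOAD_TS = SYSDATE');
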
Working in multi user environment 1 Hour
  • Setting up multi user environment – Creating, defining a connection to and activating Non-secure and Secure central repositories
  • Accessing central repository from a local repository – Adding objects, Checking out objects, Undoing check out, Checking in objects, Filtering, Getting objects, Labeling objects, Comparing objects, Viewing object history and Deleting objects
Code migration from one phase to another phase of a project 1 Hour
  • Using Export/Import method
    • ATL method
      • Exporting objects to file (.ATL)
      • Exporting whole repository to file (.ATL)
      • Importing from .ATL file
    • Directly exporting objects from one repository to another repository
  • Using central repository method
Processing data with problems 1 Hour
  • Using overflow files
  • Pass bad data to a database table for further analysis and action
Recovering from unsuccessful job execution 30 Mins
  • Automatic Recovery
  • Manual Recovery
DS Management Console
  • Administrator 2 Hours
    • Walk-through of Administrator UI
    • Management – Adding repositories, Managing user roles and Setting the log retention period etc.
    • Central repository users and groups management
    • Server group – Architecture, Executing jobs using a server group and the advantage of load balancing
    • Executing, scheduling and monitoring batch jobs
    • Troubleshooting – Understanding error and trace logs
    • Use of Export execution command option
  • Using Impact and Lineage Analysis reports 15 Mins
  • Using Operational Dashboards 15 Mins
  • Using Auto Documentation 15 Mins
Loading a tiny DWH 3 Hours
Customer Dim from relational sources, Location Dim from flat-file sources, Product Dim from XML sources, Time Dim from relational sources and Revenue Fct from relational sources; managing dependencies between jobs.
Performance Tuning 4 Hours
  • Using DB indices properly
  • Maximize push-down using the Data_Transfer transform and the pushdown_sql function (see the sketch after this list)
  • Using proper caches
    • Sources – In-memory or Pageable
    • Lookups – NO_CACHE, PRE_LOAD_CACHE and DEMAND_LOAD_CACHE
    • Table Comparisons – ROW_BY_ROW_SELECT, CACHED_COMPARISON_TABLE and SORTED_INPUT
  • Maximize Parallel execution where possible using Table partitioning and DOP option
  • Splitting data flow into sub data flows using “Run as separate process” option
  • Using Bulk Loading option
  • Using Join ranks and array fetch size options
  • Using Rows per commit option
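
As a sketch of the push-down point above: pushdown_sql() injects literal SQL into the WHERE clause that Data Services generates, guaranteeing the filter runs on the database. It is placed in a Query transform's WHERE tab; the datastore name and Oracle-style predicate here are invented.

    pushdown_sql('DS_SRC', 'ORDER_DATE >= TRUNC(SYSDATE) - 7')
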
A brief view of repository tables