
SAP BODS Online Training

SAP BODS Online Training Course Details

SAP BusinessObjects Data Services (BODS) is an ETL tool used to build data marts, operational data stores (ODS), and similar targets. It helps ensure complete and accurate information by combining data integration and data quality in one enterprise-class solution. User-defined standards are published in a custom Cleansing Package, which Data Services applies through the Data Cleanse transform. SAP BODS Certification Online Training teaches you to move data from any application into an SAP Business Suite deployment and to master data management.

SAP BODS Certification Training Overview

Spiritsofts' main objective is to provide SAP BODS 4.x training covering administration, performance tuning, and metadata management. Sessions are designed module by module around each learner's requirements and are delivered hands-on by IT professionals with 11+ years of real-time experience.

SAP BODS Online Training Course Content

Introduction to BODS
  • Place of BODS among modern DI software packages
  • Competitive advantage of BODS
  • ETL and ELT approach
  • How both approaches can be combined in one tool
  • Data Services components and architecture
Basics of BODS
  • Concepts of Job, Workflow and Data Flow
  • Concepts of Datastore
  • Creating basic Batch Job
  • Transforms overview
Data profiling and Query Transform
  • Analyzing data sources
  • Joining data from multiple sources
  • Basic transformations and built-in functions
  • Aggregating data in BODS
  • Lookup
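The Query transform ideas above — joining sources, looking up values, and aggregating — can be sketched in plain Python. This is an illustrative analogue of what a lookup plus GROUP BY does, not BODS syntax; the tables and column names are invented sample data.

```python
from collections import defaultdict

# Hypothetical source rows and a lookup table (invented sample data).
orders = [
    {"order_id": 1, "cust_id": 10, "amount": 250.0},
    {"order_id": 2, "cust_id": 10, "amount": 100.0},
    {"order_id": 3, "cust_id": 20, "amount": 75.0},
]
customers = {10: "Acme Corp", 20: "Globex"}  # lookup table: cust_id -> name

# Enrich each order with the customer name, with a default for missing keys
# (conceptually similar to a lookup with a default value in a Query transform).
enriched = [
    {**row, "cust_name": customers.get(row["cust_id"], "UNKNOWN")}
    for row in orders
]

# Aggregate: total amount per customer, like a GROUP BY in a Query transform.
totals = defaultdict(float)
for row in enriched:
    totals[row["cust_name"]] += row["amount"]
```

After the loop, `totals` holds one summed row per customer, which is exactly the shape an aggregating Query transform would emit.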
Working with Files
  • Defining file formats for file data source
  • Working with XML files (XSD, DTD)
  • COBOL copybooks
  • File Formats as Source and as Targets
Platform Transforms Overview
  • Case and Merge transforms
  • SQL transform
  • Validation transform
  • Row Generation
  • Map Operation
  • Combining Validation and Auditing – best practices
Data Integrator Transforms
  • Working with hierarchical data
  • Hierarchy Flattening transform
  • Merging data
  • Table Comparison transform
  • Other ways of merging data for RDBMS datastores
  • Slowly changing dimension support in BODS
  • History preserving transform
  • Using built-in functions
  • Use date and time functions and the date generation transform to build a dimension table
  • Use the lookup functions to look up status in a table
  • Use match pattern functions to compare input strings to patterns
  • Use database type functions to return information on data sources
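The slowly changing dimension topics above (Table Comparison plus History Preserving) amount to a Type 2 pattern: when a tracked attribute changes, close the current row and insert a new current row. A minimal Python sketch of that logic, with invented columns and dates — not BODS code:

```python
from datetime import date

# Existing dimension rows (invented sample); the current row has end = None.
dim = [
    {"cust_id": 10, "city": "Berlin", "start": date(2020, 1, 1), "end": None},
]
incoming = [{"cust_id": 10, "city": "Munich"}]  # source row with a changed attribute

today = date(2024, 6, 1)
for src in incoming:
    # Find the current row for this member, like Table Comparison does.
    current = next(
        (r for r in dim if r["cust_id"] == src["cust_id"] and r["end"] is None),
        None,
    )
    if current is None:
        # New member: plain insert.
        dim.append({**src, "start": today, "end": None})
    elif current["city"] != src["city"]:
        # Type 2 change: close the old row, open a new current row
        # (what the History Preserving transform generates).
        current["end"] = today
        dim.append({**src, "start": today, "end": None})
```

`dim` ends up with two rows for customer 10: the closed history row and the new current one.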
Metadata Management
  • Local and Central repositories
  • Working in a group
  • Propagating Job from development to staging and production environment – best practices
  • Backing up metadata
  • ATL files: export/import operations
  • Validating, tracing, and debugging jobs
  • Use description and annotations
  • Validate and trace jobs
  • Use view data and the Interactive Debugger
Management Console
  • Administrating BODS
  • Overview of all major tools
  • Execution of a Job from server
  • Scheduling a job
  • Auto Documentation reports
  • Designing ETLs with auto documentation in mind
BODS Scripting Language
  • Variables in BODS – Scope of variable
  • Scripting language – Advantages and limitations
  • Custom functions in BODS
  • Creating simple function
  • Database procedures
  • Custom functions vs. database functions
Integrating BODS with SAP BW 7.5, SAP ECC, and SAP CRM
  • Connecting to SAP Applications
  • ABAP data flow processing using Data Services
  • IDoc interface connectivity using Data Services
  • SAP application table sources in real-time jobs
  • Executing batch jobs that contain ABAP data flows
  • Connecting to SAP NetWeaver Business Warehouse
  • Using Data Services in SAP NetWeaver BW environments; Open Hub tables
  • Defining SAP NetWeaver BW datastores in Data Services
  • Loading into SAP NetWeaver BW using Data Services
Integrating BODS with HANA 1.0 SPS8
  • BODS 4.2 with HANA to build agile data marts using different transforms and file systems
Error Handling in BODS
  • Validating Data Flow, Workflow, Job
  • Error handling on Workflow level – Try and Catch objects
  • Analyzing execution of Batch Job – Log files
  • Debugging in BODS
  • Handling execution errors – best practices
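BODS Try and Catch objects bracket a sequence of workflow steps so a failure can be logged and recovered instead of aborting the job. A rough Python analogue of that control flow, with an invented step that deliberately fails to show the catch path:

```python
def load_customers():
    """Hypothetical load step; it fails here to demonstrate the catch path."""
    raise RuntimeError("target table locked")

log = []
try:
    # The "Try" block: the sequence of steps being protected.
    load_customers()
    log.append("load ok")
except RuntimeError as err:
    # The "Catch" block: record the error and run a recovery action
    # instead of letting the whole job abort.
    log.append(f"caught: {err}")
    log.append("notified operator, job marked for restart")
```

In a real workflow the catch branch typically writes to an error table or sends a notification, then lets a later run resume from the failed step.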
Performance optimization in BODS - Part 1
  • Push-down concept
  • Cases when push-down doesn't work
  • Data Transfer transform
  • Caching in BODS
  • Defining caching for lookups and Table Comparison
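The payoff of caching a lookup is simple to demonstrate: without a cache, every input row costs a database round trip; with one, only distinct keys do. A small Python sketch using in-memory memoization — an analogue of a cached lookup, not BODS behavior, with invented data:

```python
from functools import lru_cache

DB_CALLS = 0  # counter showing how many simulated database round trips occur

def query_db(cust_id):
    """Stand-in for a per-row database lookup (invented data)."""
    global DB_CALLS
    DB_CALLS += 1
    return {10: "Acme Corp", 20: "Globex"}.get(cust_id)

@lru_cache(maxsize=None)
def cached_lookup(cust_id):
    # First call per key hits the "database"; repeats are served from memory.
    return query_db(cust_id)

rows = [10, 10, 20, 10, 20]
names = [cached_lookup(c) for c in rows]
# Five input rows, but only two distinct keys reach the database.
```

The same trade-off drives the cache settings on lookups and Table Comparison: memory spent on the cache is exchanged for far fewer round trips on repetitive keys.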
Performance optimization in BODS - Part 2
  • Degree of parallelism
  • Designing custom function for parallel execution
  • Bulk loading operation
  • Using MultiLoad for Teradata targets
  • Execution parameters
  • Rows per commit
  • Number of loaders
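The "rows per commit" parameter above trades transaction overhead against recovery granularity: committing once per batch rather than once per row. A minimal Python sketch of that batching pattern against an in-memory SQLite table — the table and batch size are invented for illustration:

```python
import sqlite3

ROWS_PER_COMMIT = 2  # analogue of the BODS "rows per commit" setting

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")

rows = [(i, f"row{i}") for i in range(5)]
batch = []
for row in rows:
    batch.append(row)
    if len(batch) == ROWS_PER_COMMIT:
        conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
        conn.commit()  # one commit per batch, not per row
        batch = []
if batch:  # flush the final partial batch
    conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
    conn.commit()

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
```

Larger batches mean fewer commits and faster loads, at the cost of more rows to replay if a batch fails mid-load.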
Using BODS Metadata
  • Access to BODS metadata
  • Operational metadata
  • Auditing metadata
  • Creating your own metadata by using BODS metadata
Possible Alternative Topics
  • Data Quality in BODS
  • SAP and SAP BODS / BODI
  • Real-time job
  • Integration with other applications
  • Automatic job creation from template

SAP BODS Interview Questions

SAP BODS interview questions and answers prepared by real-time experts. Learn through SAP BODS online training classes and certification material from the best institute, serving Hyderabad, the USA, Canada, and Australia.

SAP BODS Interview Question And Answers:

Q.What is the use of BusinessObjects Data Services?

BusinessObjects Data Services provides a graphical interface that allows you to easily create jobs that extract data from heterogeneous sources, transform that data to meet the business requirements of your organization, and load the data into a single location.


Q. Define Data Services components.
Data Services includes the following standard components:

  • Designer
  • Repository
  • Job Server
  • Engines
  • Access Server
  • Adapters
  • Real-time Services
  • Address Server
  • Cleansing Packages, Dictionaries, and Directories
  • Management Console

Q. What are the steps included in Data integration process?

  • Stage data in an operational datastore, data warehouse, or data mart.
  • Update staged data in batch or real-time modes.
  • Create a single environment for developing, testing, and deploying the entire data integration platform.
  • Manage a single metadata repository to capture the relationships between different extraction and access methods and provide integrated lineage and impact analysis.

Q. Define the terms Job, Workflow, and Dataflow

  • A job is the smallest unit of work that you can schedule independently for execution.
  • A work flow defines the decision-making process for executing data flows.
  • Data flows extract, transform, and load data. Everything having to do with data, including reading sources, transforming data, and loading targets, occurs inside a data flow.
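The three definitions above can be sketched as nested functions: a data flow does extract/transform/load, a work flow adds decision-making around data flows, and a job is the schedulable wrapper. This is a conceptual Python analogue with invented step logic, not BODS code:

```python
def data_flow(source_rows):
    """Everything touching data happens here: extract, transform, load."""
    extracted = list(source_rows)                 # extract from the source
    transformed = [r.upper() for r in extracted]  # transform each row
    target = []
    target.extend(transformed)                    # load into the target
    return target

def work_flow(source_rows):
    # Decision-making: skip the load entirely when the source is empty.
    return data_flow(source_rows) if source_rows else []

def job(source_rows):
    # The job is the smallest unit you can schedule independently.
    return work_flow(source_rows)

result = job(["alpha", "beta"])
```

The nesting mirrors the hierarchy asked about in the next question: a project groups jobs, a job runs work flows, and work flows run data flows.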

Q. Arrange these objects in order by their hierarchy: Dataflow, Job, Project, and Workflow.

  • Project, Job, Workflow, Dataflow.