

Duration: 50 hours

Course Price: $349.00

Rating: 4.5 (23)


Course Content

1. Introduction

  • SAP BusinessObjects Data Services / Data Integrator Introduction
  • Overview of Data Warehousing
  • What is ETL?

2. DW Concepts

  • Definition
  • DW Purpose
  • DW Challenges
  • Typical DW Architecture

3. Architecture of BODS 3.0

  • Recap
  • DI Designer
  • Repository Manager
  • Server Manager
  • Web Administrator
  • License Manager

4. BODS Objects

  • Objects Hierarchy
  • Project
  • Job
  • Work flow
  • Data flow
  • Datastore
  • Formats

5. Transforms

  • Query Transform
  • SQL Transform
  • Effective_Date
  • Case Statement
  • Table_Comparison
  • Pivot (Columns to Rows)
  • Reverse Pivot (Rows to Columns)
  • Hierarchy_Flattening
  • Scripts

6. Variables and Parameters

  • Variables
  • Data Cleansing

7. Optimization

  • Environment
  • Bulk Loading
  • Optimization
  • Parallel Execution
  • Quiz

8. Assignments

  • Practice sessions


Interview Questions & Answers


1) What is the use of BusinessObjects Data Services?

 BusinessObjects Data Services provides a graphical interface that allows you to easily create jobs that extract data from heterogeneous sources, transform that data to meet the business requirements of your organization, and load the data into a single location.


2) What are the steps included in the data integration process?

  • Stage data in an operational data store, data warehouse, or data mart.
  • Update staged data in batch or real-time modes.
  • Create a single environment for developing, testing, and deploying the entire data integration platform.
  • Manage a single metadata repository to capture the relationships between different extraction and access methods and provide integrated lineage and impact analysis.

3) What is an Embedded Dataflow?

 An Embedded Dataflow is a dataflow which is called from inside another dataflow.


4) Define the terms Job, Workflow, and Dataflow?

  • A job is the smallest unit of work that you can schedule independently for execution.
  • A workflow defines the decision-making process for executing data flows.
  • Data flows extract, transform, and load data. Everything having to do with data, including reading sources, transforming data, and loading targets, occurs inside a data flow.
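
The containment hierarchy described above (Project > Job > Work flow > Data flow) can be sketched as nested objects. This is only an illustration of the relationships; all object names below are hypothetical, not part of Data Services.

```python
# Illustrative containment hierarchy: Project > Job > Work flow > Data flow.
# All names are made up for the example.
project = {
    "name": "SalesProject",
    "jobs": [{
        "name": "DailyLoadJob",              # the schedulable unit
        "workflows": [{
            "name": "WF_Load",               # orders the data flows
            "dataflows": ["DF_Extract_Customers", "DF_Load_Orders"],
        }],
    }],
}

# Data flows are the leaves: the only place data is actually read,
# transformed, and loaded.
all_dataflows = [df
                 for job in project["jobs"]
                 for wf in job["workflows"]
                 for df in wf["dataflows"]]
```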


5) What is the difference between a data store and a database?

A database physically stores the data; a datastore is the connection through which Data Services accesses a database (or application) and imports its metadata.


6) What is a transform?

 A transform enables you to control how datasets change in a dataflow.


7) What is a Script?

  It is a single-use object that is used to call functions and assign values in a workflow.


8) How do you check the execution history of a job or a data flow?

DS Management Console → Job Execution History


9) What is the Text Data Processing transformation?

Text Data Processing allows you to extract specific information from large volumes of unstructured text. You can search for facts and entities, such as customers, products, and financial facts, specific to an organization. The transformation also identifies relationships between entities and extracts them.

The data extracted with Text Data Processing can then be used in business intelligence, reporting, query, and analytics applications.
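
As a rough analogy only (this is not the actual transform), the idea of pulling entities out of free text can be sketched with regular expressions. The pattern, sample text, and "product code" format below are all hypothetical.

```python
import re

# Hypothetical mini-example of entity extraction from unstructured text.
text = "Customer Acme reported issues with PRD-1042 and PRD-2210 last week."

# Extract "product" entities matching an assumed PRD-<digits> code format
product_codes = re.findall(r"PRD-\d+", text)

# Extract "customer" entities from an assumed "Customer <Name>" phrasing
customers = re.findall(r"Customer (\w+)", text)
```

The real transform uses linguistic parsing and dictionaries rather than fixed patterns, which is what lets it find entities and relationships it was never given exact patterns for.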


10) List the Data Quality transforms.

  1. Global_Address_Cleanse
  2. Data_Cleanse
  3. Match
  4. Associate
  5. Country_ID
  6. USA_Regulatory_Address_Cleanse


11) What is the use of the Query transform?

The Query transform is the most commonly used transform in Data Services. With it you can:

  1. Filter data from sources
  2. Join data from multiple sources
  3. Perform functions and transformations on the data
  4. Map columns from input to output schemas
  5. Assign primary keys
  6. Add new columns, nested schemas, and function results to the output schema

Because it is used so often, the tool palette provides a shortcut for the Query transform.
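
A minimal sketch of those operations in plain Python, to make the filter/join/mapping steps concrete. The table names, columns, and values are illustrative, not from Data Services.

```python
# Hypothetical source "tables" as lists of rows
customers = [
    {"cust_id": 1, "name": "Acme", "country": "US"},
    {"cust_id": 2, "name": "Globex", "country": "DE"},
]
orders = [
    {"order_id": 10, "cust_id": 1, "amount": 250.0},
    {"order_id": 11, "cust_id": 2, "amount": 99.0},
]

# Filtering data from a source (a WHERE-style condition)
us_customers = [c for c in customers if c["country"] == "US"]

# Joining data from multiple sources on a key
joined = [
    {"order_id": o["order_id"], "name": c["name"], "amount": o["amount"]}
    for o in orders
    for c in us_customers
    if o["cust_id"] == c["cust_id"]
]

# Column mapping plus a new derived column (an assumed 10% tax rate)
output = [{**row, "amount_with_tax": round(row["amount"] * 1.1, 2)}
          for row in joined]
```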


12) What are cleansing packages?

Cleansing packages are packages that enhance the ability of Data Cleanse to accurately process various forms of global data by including language-specific reference data and parsing rules.


13) What is Data Cleanse?

The Data Cleanse transform identifies and isolates specific parts of mixed data, and standardizes your data based on information stored in the parsing dictionary, business rules defined in the rule file, and expressions defined in the pattern file.


14) What factors do you consider when deciding whether to run work flows or data flows serially or in parallel?

Consider the following:

  1. Whether the flows are independent of each other
  2. Whether the server can handle the processing load of the flows running at the same time (in parallel)
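
The trade-off can be sketched with standard Python concurrency. The two "flows" below are placeholder functions simulating I/O-bound work, not the Data Services engine; running them in parallel is only safe because they share no dependencies.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def load_customers():
    """Placeholder for one independent data flow (simulated I/O wait)."""
    time.sleep(0.2)
    return "customers done"

def load_orders():
    """Placeholder for a second, independent data flow."""
    time.sleep(0.2)
    return "orders done"

# Serial execution: total time is roughly the sum of both flows
start = time.perf_counter()
serial_results = [load_customers(), load_orders()]
serial_elapsed = time.perf_counter() - start

# Parallel execution: roughly the time of the slowest flow,
# provided the server can handle both at once
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    parallel_results = list(pool.map(lambda f: f(), [load_customers, load_orders]))
parallel_elapsed = time.perf_counter() - start
```

If the flows were dependent (say, orders referencing customers loaded in the first flow), serial execution would be the correct choice regardless of server capacity.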



