Databases and their WebApps/Applications
Users sometimes need a list of the databases in which they are working and the WebApps and applications that use those databases. The following is a general overview of delivered DSP® applications and their uses.
To produce a list of databases and their applications (including Custom WebApps), run the query in the Query section at the end of this article in SQL Server Management Studio (SSMS).
Applications installed with the DSP (excludes Custom WebApps):
- Advanced Data Migration (previously dspMigrate)
- Agent Interface
- Assemble
- AutoGen
- Automate
- Collect
- Common
- DSP Add-Ons
- Mass Maintenance (previously dspCompose)
- Master Data Management (previously dspConduct)
- Data Quality (previously dspMonitor)
- dspTrack
- Information Steward Accelerator
- Integrate
- Security Migration App
- System Administration
Agent Interface
The Migration dashboard, visible in the Syniti Knowledge Tier, provides visibility into the progress and status of enterprise Data Migration efforts by giving users access to the most important migration project metrics. The Agent Interface facilitates communication between the DSP and the Syniti Knowledge Tier to pass the data to this dashboard.
Assemble
Assemble is an import and export tool that allows for the creation and execution of packages that transfer data between two sources. It supports several source and file types for import and export: SQL Server, Oracle, and ODBC/OleDB databases; fixed-width and delimited text files; and Excel spreadsheets. Packages can be created and executed for any version of Microsoft SQL Server. Assemble can be used independently or in conjunction with other applications, including Collect, Automate, and Transform.
AutoGen
SQL AutoGen
SQL AutoGen creates basic objects used by the migration process (tables, rules and reports) so that users can focus on addressing complex requirements.
All objects created using SQL AutoGen are written to the data source defined for the object in Console (Process Area > Objects > Vertical View > Data Source ID).
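As a rough illustration of what such a generated rule can look like, the sketch below shows a simple field-level transformation rule in T-SQL; the table and column names are hypothetical, not actual AutoGen output.

-- Hypothetical sketch only: a basic rule of the kind SQL AutoGen creates,
-- defaulting a target field where the source supplied no value.
UPDATE tgt
SET tgt.PlantCode = 'P100'
FROM ttMaterial tgt
WHERE tgt.PlantCode IS NULL;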
Data Services AutoGen
Data Services AutoGen is driven by Target Design in Advanced Data Migration. It generates Data Services Jobs that perform target enrichments and validations, including post-load validation of the data loaded into the target application.
Using the AutoGen request options in the DSP, users send the metadata required to generate the Data Services Jobs for the specific target to the dspCloud™ for processing. Data Services Jobs are created based on Target Design. Once generated, the Jobs are registered in Transform as Data Services rules.
As the Jobs created in Data Services are based on the design of the target system and the rules applicable to it, the Jobs can be reused across multiple migration Waves and also post go-live for data governance purposes, such as interfaces, data transfers, or further migrations (acquisitions/mergers).
Automate
Automate, a middleware tool, is a DSP WebApp that allows automated processing of tasks. An interface can be designed using events, stored procedures, workflows, etc. to manage a complete business process. The interface can then be scheduled to run automatically during off hours.
For example, Automate could connect various corporate systems with SAP and allow data to flow smoothly between them based on exact specifications. An Interface Designer can rapidly deploy new interfaces or modify existing ones without weeks of designing and programming.
Automate uses ANSI Standard SQL as its ETL language. It supports FTP, workflow and interactive web pages for managing data repair and error handling.
Automate can support any native DSP response capability including:
- Stored Procedures
- Workflow
- Assemble (formerly known as “CranPort”)
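Because Automate uses ANSI-standard SQL as its ETL language, an individual interface step is typically just a SQL statement. The following sketch, with purely illustrative table names, shows the kind of statement such a step might run:

-- Illustrative only: copy newly received staging records into a target
-- table, skipping rows that already exist there.
INSERT INTO TargetOrders (OrderID, CustomerID, OrderDate)
SELECT s.OrderID, s.CustomerID, s.OrderDate
FROM StagingOrders s
WHERE NOT EXISTS
    (SELECT 1 FROM TargetOrders t WHERE t.OrderID = s.OrderID);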
Collect
Collect is a data repository management component at the core of many DSP product offerings. Collect maintains a unified collection of data from multiple, disparate systems without requiring users to connect to the actual systems. It can be configured to extract the data at scheduled intervals and at an individual table level. In this way, the data can be kept in sync based on the processing schedule of each data source. Collect also contains import groups to organize and manage sets of tables.
Common
Common is a collection of utilities used throughout the DSP and Advanced Data Migration components, including those that track rate of change and perform data encryption, data profiling, duplicate detection and scheduling.
Console
Before beginning work on a migration project, the hierarchical migration structure must be set up in Console so it can be passed to each component. Waves, Process Areas, and Objects must be added in Console, which passes them to the other Advanced Data Migration components. For example, a Wave is registered once in Console and then passed to Map and Transform. Elements should also be deleted in Console, but they cannot be deleted while they are used in a migration project in Map or Transform.
Construct
Construct is the component of Advanced Data Migration that assists in creating, enhancing and converting data to load into the Target system. Construct eliminates the inconsistency of maintaining multiple spreadsheets and provides a single location where users can update or directly enter data. Construction pages are used to enter or modify data required by the ERP system that doesn’t currently exist in the Source system.
DSP Add-Ons
An Administrator can use DSP Add-Ons to create custom security for delivered WebApps or custom WebApps. By creating Custom WebApp groups, users can easily customize access for pages depending on any organizational need. For example, WebApp groups could be created for departments, to group pages that are used to perform certain tasks, or to grant access to a page to a limited number of users.
Mass Maintenance (previously dspCompose™)
Mass Maintenance is a generic governance engine used with SAP Master Data Governance (MDG). It provides industry-specific content and workflows to initiate process governance scenarios for data domains not already handled by SAP MDG, such as Human Capital Management (HCM), Bills of Material (BOMs), and routings. This strategy reduces development cycle time and speeds the achievement of master data objectives while fully leveraging SAP MDG.
Mass Maintenance is a workflow-enabled application to govern the entry, review and approval of a proposed mass change. Users are assigned to roles, which are associated with templates. A template aligns with a single BDC or GUI script, an Integrate template, or a custom template, and can be reused for multiple requests. A request drives the workflow process to mass change a single object in SAP.
Highly customizable, Mass Maintenance can be configured to control user access to request data and how a request is processed from request data entry through review and posting to a target system.
A change can be based on a database view utilizing a Where clause, an Excel spreadsheet, or manual input by the user. BDC and GUI scripts can be used to update data in a target ERP system such as SAP. Messages are returned from the target system to show which objects were successfully updated.
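As a minimal sketch of the first option, assuming hypothetical table and column names, a view with a Where clause scopes the records a request will change:

-- Hypothetical example: a view that limits a mass change to materials
-- in a single plant; a request built on this view touches only these rows.
CREATE VIEW vMassChangeMaterialsPlant1000 AS
SELECT MaterialNumber, MaterialDescription, BaseUnitOfMeasure
FROM Material
WHERE Plant = '1000';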
Mass Maintenance posts mass change data to SAP via Integrate, a Syniti component used as the posting mechanism and script repository. Integrate contains all the commands for every BDC and GUI script used to load data into SAP. Each script represents a mass update process. Refer to Integrate for detailed information.
dspCompose_Data
When Mass Maintenance (dspCompose) is installed, the WebApp dspCompose_Data and its database are also installed. dspCompose_Data is the configured WebApp and database where client customizations and data validations should be created so that they are not overwritten during an upgrade. In addition to dspCompose_Data, other custom WebApps and companion databases may be created and used with Mass Maintenance. This WebApp enables supporting data used by Mass Maintenance to be maintained through the application rather than in the backend.
Master Data Management (previously dspConduct™)
Master Data Management (dspConduct) provides an Application Data Management solution for governing data in business processes across disparate applications and infrastructures.
MDM provides the ability to design, execute, and monitor business processes within an organization as they relate to the creation and maintenance of data within the enterprise architecture of an organization. A Designer creates the governance elements hierarchy, where tasks, roles, scenarios, business processes and their dependencies are defined.
MDM enables users to manage the creation of master data. Master data is collected, validated, reviewed, approved, and posted through a request process that MDM provides. A request consists of the tasks, roles, and scenarios within a business process. The processes are grouped by category (e.g., Material, Customer, Vendor). Users of MDM can create their own custom applications and request pages for any object necessary. The Content WebApp is registered at the category level within MDM.
MDM Content Configuration
When Master Data Management (dspConduct) is installed, the WebApp Content Configuration and its database are also installed. Content Configuration is the configured WebApp and database where client customizations and data validations should be created so that they are not overwritten during an upgrade. In addition to Content Configuration, other custom WebApps and companion databases may be created and used with Master Data Management. This WebApp enables supporting data used by MDM to be maintained through the application rather than in the backend.
Data Quality (previously dspMonitor™)
Data Quality is the analytical component of the DSP that facilitates reporting on and maintaining high-quality master data. It allows business users to:
- Pull data from the ERP database on a scheduled or an on-demand basis to constantly review data, especially data that could cause a significant business process interruption
- Establish data quality thresholds and view data quality scores and status
- Organize and display quality metric reports via web access or other formats for process improvement
- Identify and send errors (via workflow notifications) to business users for cleansing
Data Quality is delivered with a standard set of reports. Users can also create custom reports to fit their business needs.
DQ Construction
Data Quality Construction provides the ability to rapidly create and maintain a web-based user interface to manage cross-reference tables and to control parameters dynamically within the application, rather than hard-coding those values in the report. The report is built once. If the report requires updates (for example, if parameters change or values need to be added), a business user does not need a developer to change the underlying code of the report; changes can be made directly in the web-based application. These changes are fully auditable.
To edit reports in Data Quality Construction, users must be granted access to the application.
Often within a Data Quality implementation, a business-specific configuration is required that is not maintained within the system of record. For example, a system may deliver over 150 valid Units of Measure, yet only 10 of these values should be business allowable values for the Base Unit of Measure of a Material. The remaining Units of Measure are used for Packaging, Weight, Volume, Sales, or Purchasing; conversion factors; or other needs and therefore should not be options for Base Unit of Measure in the report.
Data Quality Construction allows the developer to create fully auditable web pages for business users to maintain this configuration through the user interface of the DSP, so that the developer does not have to change the underlying code of the report. This process not only saves on IT costs but also directly enables the business to make necessary changes as they are identified, with no development involvement and no changes to estimates, budgets, or funding.
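A minimal sketch of such a cross-reference table and the report check that uses it, with hypothetical names throughout:

-- Hypothetical cross-reference table of business-allowable Base Units of
-- Measure, maintained through a Data Quality Construction page.
CREATE TABLE xrefAllowedBaseUoM (UoM NVARCHAR(3) NOT NULL PRIMARY KEY);
INSERT INTO xrefAllowedBaseUoM (UoM) VALUES ('EA'), ('KG'), ('L');

-- Report query: flag materials whose Base UoM is not an allowable value.
SELECT m.MaterialNumber, m.BaseUoM
FROM Material m
WHERE m.BaseUoM NOT IN (SELECT UoM FROM xrefAllowedBaseUoM);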
Information Steward Accelerator
The Information Steward Accelerator (ISA) extracts data that failed SAP Information Steward (IS) rules, then distributes the data to groups of users. These user groups, called Project Distributions, can receive the data as an Excel file (.xlsx) attached to an email or can view a summary of the rule's data.
Rules are associated with both Project Distributions and individual users, and each rule has an associated Quality Dimension by which rules are grouped. Filters can be applied at the user and Project Distribution levels to filter the data sets sent to specific user groups.
Once rules are processed in IS, ISA sends workflow emails to users who are configured to receive them. These emails notify the user of failed records. The email can contain an attachment in .xlsx format with all of the failed records, or can contain a summary table with failed record counts for each rule. The email also contains a link to the associated IS scorecard and back to the DSP platform workflow.
A dashboard is also available in the ISA to visualize the project distribution and user level statistics not available in the IS scorecards. Refer to View the ISA Dashboard for more information.
IS Data Construct
ISA Construction provides the ability to rapidly create and maintain a web-based user interface to manage cross-reference tables and to control parameters dynamically within the application, rather than hard-coding those values in the SAP Information Steward Rule or View. The IS Rule is built once. If the IS Rule requires updates (for example, if parameters change or values need to be added), a business user does not need a developer to change the underlying code of the IS Rule; the business user can make changes directly in the web-based application. These changes are fully auditable. To edit reports in ISA Construction, users must be granted access to the application.
Often within a Data Quality implementation, a business-specific configuration is required that is not maintained within the system of record. For example, a system may deliver over 150 valid units of measure, yet only 5–10 of these values are business allowable values for the Base Unit of Measure of a Material. The remaining units of measure are used for Packaging, Weight, Volume, Sales, or Purchasing; conversion factors; or other needs and therefore should not be options for Base Unit of Measure.
ISA Construction allows the developer to create fully auditable web pages for business users to maintain this configuration through the user interface of the DSP, so that the developer does not have to change the underlying code of the IS Rule. This process not only saves on IT costs but also directly enables the business to make necessary changes as they are identified, with no development involvement and no changes to estimates, budgets, or funding.
Integrate
Integrate is a component of the DSP that contains loading mechanisms to push data into an ERP system.
Integrate loads data to SAP using:
- Batch Data Communication (BDC) processing
- Graphical User Interface (GUI) scripting
- A Remote Function Call (RFC) or a Business Application Programming Interface (BAPI)
Integrate can also create and transfer user-defined text files formatted as Delimited, Fixed Width, or XML to the SAP server.
Integrate uses standard SAP posting functionality as well as Syniti-delivered custom processing and can be used by any platform component (including the framework) to post data to the target system.
Integrate is organized into templates and processes. A template is based on a template type and defines how data is posted into an ERP system. Templates are not tied to data, but rather act as an independent guide for posting that can be assigned to many processes. A process is a series of posting steps that defines how Integrate takes data from the component and loads it into SAP.
Map
Map is a strategic component of Advanced Data Migration that facilitates the process of mapping for a Target system implementation. Map and Target Design are used to document the design and mapping phases of a migration project. Once Targets and Sources are created in Target Design and the design is synced, they are pushed to Map, where field mapping and value mapping can be performed.
After the mapper has configured a field mapping, it is submitted and must be approved by a Developer on the Mapping Approval page. Refer to Approve or Reject Field Mappings for more information.
Since the mapping process can change at various times during the implementation, Map is a key component within Advanced Data Migration (dspMigrate) for capturing and tracking requirements, customer decisions, and specifications to provide accurate mapping information.
Rules created in Map can then be auto-generated.
Security Migration App
When upgrading from 7.0.6 or below to 7.1 or above, users may need to migrate their security settings to use centralized security. Users of Data Quality (formerly dspMonitor), Master Data Management (formerly dspConduct), and Mass Maintenance (formerly dspCompose) must update security roles when upgrading to 7.1. Refer to the Centralized Security Migration Manual for important information about using security in the DSP in version 7.1 and later. Consult this manual BEFORE updating to 7.1, as an analysis of current security assignments must be completed before the DSP can be updated.
System Administration
System Administration allows administrator users to perform tasks related to user administration and security, CTS, DSP configuration, and catalogs for translations.
Other tasks include:
- Customize Dashboard Page Display
- Log Events and Access to Personal Data
- Register a Data Source
- Add Custom Links to a Page
- Build Indices for a Data Source for Search and Duplicate Detection
- Configure Filters in the DSP
- Manage Failed Jobs
- View Logs
- Manage Duplicate Detection
- Configure Service Pages
- Stop and Start Service Pages
- View Messages
Target Design
Target Design is a component of Advanced Data Migration used to document the tables, fields, and Sources used in the migration project. Before Targets can be created in Target Design, a Wave, a Process Area, and an Object must have been created in Console. Targets can be imported from a database, System Type, or Excel file, or can be added manually.
Transform
Transform is a component of Advanced Data Migration that provides measurable advantages in the Data Preparation phase of a data migration project. The tool cleans, manipulates and reports on data, and exports the data so that it can be loaded into a target ERP system using a load tool.
Query
To produce a list of databases and the applications they support, run the following query in SSMS:
-- Run against the DSP platform (CranSoft) database.
USE CranSoft;

-- List each WebApp with the data source and database it uses.
SELECT WebApp.WebAppName, DataSource.DataSourceName, DataSource.[Database]
FROM WebApp
INNER JOIN DataSource ON WebApp.DataSourceID = DataSource.DataSourceID
ORDER BY WebAppName;
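To narrow the results to the databases behind a single application, add a Where clause to the same query; the WebApp name below is only an example:

USE CranSoft;
SELECT WebApp.WebAppName, DataSource.DataSourceName, DataSource.[Database]
FROM WebApp
INNER JOIN DataSource ON WebApp.DataSourceID = DataSource.DataSourceID
WHERE WebApp.WebAppName = 'dspCompose'
ORDER BY WebAppName;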