Sample DMS files
AdvancedCleaning.dms
This shows using advanced mrScriptBasic features to iterate through all of the questions in a survey, performing different cleaning routines according to the question type. For more information, see Example 6: Advanced cleaning example.
CaDSCommentBlock.dms
Provides an IBM SPSS Collaboration and Deployment Services comment block for .dms files. The sample defines one input parameter, one output variable, and two output files.
CaseDataOnly.dms
This transfers data from the Employee Data sample IBM SPSS Statistics .sav file to another .sav file without specifying a metadata source. For more information, see the fourth example in the InputDataSource section.
Cleaning.dms
This shows some ways of cleaning single response questions for which respondents have selected more than one response. For more information, see Example 1: More than one response to a single response question.
Cleaning2.dms
This shows using a loop to iterate through all of the single response questions included in the job, testing whether they have more than one response, and if they do, writing the details to a report file and setting the DataCleaning.Status system variable. For more information, see the first example in Example 3: More on cleaning single response data.
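In outline, the per-case check that this sample applies looks something like the following minimal sketch, shown here for a single, hypothetical question name ("interest") rather than the sample's loop over all of the single response questions in the job; the category and note text are placeholders:

Event(OnNextCase, "Flag single response questions with extra responses")
    ' "interest" is a hypothetical single response question name.
    If interest.AnswerCount() > 1 Then
        ' Mark the case so that it can be reviewed or filtered later.
        DataCleaning.Status = {NeedsReview}
        DataCleaning.Note = "interest has more than one response"
    End If
End Event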
Cleaning3.dms
This shows how to select the highest and lowest response alternately and use the mrScriptBasic object collection iteration feature to validate all of the iterations of a grid question. For more information, see the second and third examples in Example 3: More on cleaning single response data.
CreateCSVAndCategoryMap.dms
This script transfers case data to a tab-delimited .csv file, which can be opened in Microsoft Excel. To help you understand the data in the .csv file, the script also creates a text file that contains a list of the category values used and the equivalent category labels.
CreateDRS.dms
This writes the case data to a text file in the standard Quancept .drs format. This sample demonstrates a recursive subroutine call, accessing category attributes, dealing with multiple response values, retrieving category labels, and using a string replace function. This sample does not export Other Specify texts.
DDFToRDB.dms
This transfers the Museum sample UNICOM Intelligence Data File to a relational MR database. Before you run this example, you need to create a database called NewMuseum and, if necessary, change the connection string in the OutputDataSource section to reflect the name of the server you are using. For more information, see Transferring data to a relational MR database (RDB).
DDFToSavHdataTransfer.dms
This script demonstrates how DMOM transfers case data from a hierarchical data set to a flattened data set when the input data source query does not contain any levels variables (grids or loops).
DDFToSavHdataTransferExpressionAggregation.dms
This script demonstrates DMOM support for expressions and aggregations in an input data source query.
DMSRunExample.dms
This script demonstrates the use of text substitution in a DMS file when running the file using the DMS Runner command prompt utility. Like the sample file Simple.dms, this script transfers the Museum sample UNICOM Intelligence Data File to a new IBM SPSS Statistics .sav file. When you run the script, you must use the DMS Runner /d option to replace the text “Target” with the chosen name for your output data source, and to replace the text “LogSize” with the maximum size of the log file in KB. For example:
DMSRun DMSRunExample.dms /d"Target \"MyOutputDatasource\"" /d"LogSize 250"
For more information, see DMS Runner.
GetOtherSpecifyText.dms
This provides an example of accessing the open-ended responses to an Other Specify category in the OnNextCase Event section. For more information, see Example 4: Cleaning other question types.
GlobalSQLVariable.dms
This shows how to use a global SQL variable in a DMS file that is being used to clean or transfer batches of case data and write it to a different data source. For more information, see GlobalSQLVariables section. You need to run the GlobalSQLVariableSetUp.dms file before you run this sample to set up the output data source and transfer the first records. You can use the RunGlobalSQLVariableExample.bat sample batch file to run the two files in sequence: see Sample batch files.
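As a rough guide to the syntax involved, a GlobalSQLVariables section looks something like the following minimal sketch; the section name, connection string, and query here are placeholders rather than the sample's actual values, and the resulting variable is typically referenced in later queries with an @ prefix (for example, @MaxSerial):

GlobalSQLVariables(MyGlobalVariables)
    ' Placeholder connection string pointing at the existing output data source.
    ConnectionString = "Provider=mrOleDB.Provider.2; Data Source=mrDataFileDsc; Location=Output.ddf; Initial Catalog=Output.mdd"
    ' Pick up the highest serial number that has already been transferred.
    SelectQuery = "SELECT MAX(Respondent.Serial) AS MaxSerial FROM vdata"
End GlobalSQLVariables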
GlobalSQLVariableSetUp.dms
This sets up the output data source for use by the GlobalSQLVariable.dms sample. You can use the RunGlobalSQLVariableExample.bat sample batch file to run the two files in sequence: see Sample batch files.
IncludeExample.dms
This shows using the Include1.dms and Include2.dms Include files. For more information, see Using include files in the DMS file.
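The include mechanism itself is a one-line directive placed at the point in the DMS file where the shared sections should appear; for example (the surrounding script is omitted here):

#include "Include1.dms"
#include "Include2.dms"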
JapaneseMergeHorizontal.dms
This sample demonstrates a horizontal merge of two Japanese (multibyte) data sources. For more information, see Merging Japanese (multibyte) data sources.
JapaneseMergeVertical.dms
This sample demonstrates a vertical merge of two Japanese (multibyte) data sources. For more information, see Merging Japanese (multibyte) data sources.
Logging.dms
This shows a Logging section and using the log file to record data cleaning information. For more information, see Logging section.
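For reference, a Logging section needs only a few properties; this minimal sketch uses placeholder values for the path, group, and alias:

Logging(MyLog)
    ' Folder in which the log file is created (placeholder path).
    Path = "C:\Temp"
    ' Group and alias used to name the log file.
    Group = "DataCleaning"
    Alias = "clean"
    ' Maximum size of the log file in KB.
    FileSize = 500
End Logging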
MDM2Quantum.dms
This sample uses the Metadata Model to Quantum Component to set up card, column, and punch definitions in the input Metadata Document (.mdd) file before exporting the case data from the Museum sample UNICOM Intelligence Data File to a Quantum-format .dat file. This sample also includes code to create the Quantum specification, as shown in Objects in the OnBeforeJobStart Event section. The OutputDataSource section in this sample is in an include file called QuantumOutput.dms.
MDM2QuantumExtra.dms
This sample uses the CardColsExtra.dms Include file to set up card, column, and punch definitions for a variable created in the Metadata section, and the QuantumOutput.dms Include file to define the OutputDataSource section. For more information, see OnAfterMetaDataTransformation Event section.
MergeHorizontal.dms
This sample shows how to combine the case data from two or more data sources into a single data source. In a horizontal merge, the variables from all the data sources are combined into a single case. For more information, see Example of a horizontal merge.
MergeVertical.dms
This sample shows how to combine the case data from two or more data sources into a single data source. In a vertical merge, the cases from the second and subsequent data sources are output after the cases from the first data source. For more information, see Example of a vertical merge.
MetadataOnly.dms
This shows using the Null DSC to operate on metadata only and a simple Metadata section that sets up two simple Boolean variables. For more information, see the third example in the InputDataSource section.
UNICOM Intelligence Interviewer - Server-prefixed sample DMS files
These samples are designed for use with data collected using UNICOM Intelligence Interviewer - Server and are listed separately. For more information, see Sample DMS files for exporting UNICOM Intelligence Interviewer data.
MS-prefixed sample DMS files
These samples integrate with Microsoft Office applications and are listed separately. For more information, see Sample DMS files that integrate with Microsoft Office.
MyFirstCleaningScript.dms
This is the sample used in 5. Running your first cleaning script. This sample includes an update query in the InputDataSource section that deliberately introduces some errors into a copy of the Museum XML sample data. These errors are then “cleaned” in the OnNextCase Event section. Without these modifications, the cleaning script would not change the data because the Museum sample data is generally clean.
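The error-introducing step is simply an update query that runs against the input data before the transfer; the following minimal sketch shows the general shape only, with a placeholder connection string, question name, and values rather than the sample's actual query:

InputDataSource(Input, "Copy of the Museum XML sample data")
    ' Placeholder connection string for the XML copy of the Museum sample.
    ConnectionString = "Provider=mrOleDB.Provider.2; Data Source=mrXmlDsc; Location=museum.xml; Initial Catalog=museum.mdd"
    ' Deliberately give a single response question two responses in a few
    ' cases, so that the OnNextCase cleaning code has something to correct.
    UpdateQuery = "UPDATE vdata SET interest = {dinosaurs, birds} WHERE Respondent.Serial < 10"
End InputDataSource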
MyFirstTransfer.dms
NewVariables.dms
This shows some examples of creating new variables in a DMS file, as described in Creating new variables. This sample also demonstrates using the log file in your OnNextCase Event section error handling.
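New variables are declared in a Metadata section using mrScriptMetadata; a minimal sketch might look like the following, where the language, context, variable names, labels, and expression are all placeholders:

Metadata(en-US, Question, Label)
    ' A simple Boolean flag (placeholder name and label).
    IsCleaned "Case has been cleaned" boolean;
    ' A derived numeric variable computed from existing questions
    ' (placeholder names in the expression).
    TotalVisits "Total number of visits" long
        expression("visits1 + visits2");
End Metadata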
OnAfterMetaDataTransformation.dms
This shows how to set up card, column, and punch definitions for variables created in the Metadata section and to create the Quantum specification. For more information, see OnAfterMetaDataTransformation Event section.
OnBadCase.dms
This script shows how to use the OnBadCase event section to write the details of each invalid record to a text file. The script also writes a total count of bad cases to the text file. For more information, see OnBadCase Event section.
QdiDrsToDDF.dms
This transfers the Museum sample .qdi and .drs files to a UNICOM Intelligence Data File (.ddf). For more information, see Transferring data from QDI/DRS.
QuantumToDDF.dms
This script transfers data from the “Ski Demo” Quantum sample data supplied with the UNICOM Intelligence Developer Documentation Library to a UNICOM Intelligence Data File (.ddf). For more information, see Transferring data from Quantum.
QuantumWeighting.dms
Uses the Weight Component to add a weight to a Quantum output file. The script is similar to the Weighting.dms sample, but includes an OnAfterMetaDataTransformation event to set up card, column, and punch definitions for all variables. In addition, the variable that is used to store the weight information has a slightly different definition. For more information about weighting, see Working with the Weight component.
QuanvertPkdToDDF.dms
Provides an example of using a DMS file to export data from a Quanvert packed database to a UNICOM Intelligence Data File (.ddf).
QuanvertToDDFHdataTransfer.dms
This script transfers data from the sample metadata and case data files, supplied with the UNICOM Intelligence Developer Documentation Library, to a UNICOM Intelligence Data File (.ddf) where grid questions are directly transferred.
QuanvertToDDFHdataTransferLevel.dms
This script demonstrates DMOM support for selecting variables from different Levels by up-lev or down-lev.
QuanvertToDDFHdataTransferUnboundloop.dms
This script transfers data from the sample metadata and case data files, supplied with the UNICOM Intelligence Developer Documentation Library, to a UNICOM Intelligence Data File (.ddf) where unbound loop questions are directly transferred.
QuanvertToSav.dms
Creates a Metadata Document (.mdd) file from a Quanvert database, cleans up some of the Quanvert-specific information, and then transfers data from the Quanvert database to a .sav file. This sample illustrates text substitution using the #define syntax and uses the QuanvertInput.dms Include file. For more information, see Transferring data from Quanvert.
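The #define text substitution that this sample illustrates behaves like a straight textual replacement: you define the token once and its value (including the quotation marks) is substituted wherever the token appears later in the file. A minimal sketch with placeholder names and paths:

' Define the substitution once, near the top of the DMS file.
#define OUTPUTMDD "C:\Output\museum.mdd"

OutputDataSource(Output)
    ConnectionString = "Provider=mrOleDB.Provider.2; Data Source=mrSavDsc; Location=C:\Output\museum.sav"
    ' OUTPUTMDD is replaced by the quoted path defined above.
    MetaDataOutputName = OUTPUTMDD
End OutputDataSource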
RDBToQuantum.dms
This transfers the data in the SavToRdb relational MR database (which is the output of the SavToRDB.dms sample) to a Quantum-format ASCII file. It uses the CardCols.dms Include file to set up card, column, and punch definitions in the input .mdd file and create the Quantum specification, and the RDBInput.dms and QuantumOutput.dms Include files to define the InputDataSource and OutputDataSource sections respectively. Before you run this sample, make sure the connection string in the InputDataSource section reflects the name of the server you are using.
RDBToSAS.dms
This script transfers a subset of the data in the NewMuseum relational MR database (which is the output of the DDFToRDB.dms sample) to a SAS data file. The script also creates a SAS program file, which specifies the user-defined formats for some of the SAS variables and can be used by SAS products to read the data file. For more information, see Transferring data to SAS.
RDBToSav.dms
This script transfers a subset of the data in the NewMuseum relational MR database (which is the output of the DDFToRDB.dms sample) to a .sav file. The OutputDataSource section is specified using the SavOutput.dms include file. Before you run this sample, make sure the connection string in the InputDataSource section reflects the name of the server you are using.
RDBToTripleS.dms
This script transfers the data in the NewMuseum relational MR database (which is the output of the DDFToRDB.dms sample) to Triple-S metadata and case data files. The script creates two Triple-S case data files, one that contains fixed-format fields and another that contains comma-separated values. For more information, see Transferring data to Triple-S.
SavCustomization.dms
This contains an OnBeforeJobStart Event section that sets up custom properties in the input metadata before exporting the data to a new IBM SPSS Statistics .sav file. The custom properties customize both the .sav file and the .xml file created by the SPSS Statistics SAV DSC. For more information, see Transferring data to IBM SPSS Statistics.
SavToQuantum.dms
Uses the SavMddInput.dms, CardColsPlus.dms, and QuantumOutput.dms Include files to create an .mdd file from a .sav file, set up the serial number system variable (Respondent.Serial) and card, column, and punch definitions, and export the case data to Quantum.
SavToRDB.dms
Creates a Metadata Document (.mdd) file from the .sav file in the OnBeforeJobStart Event section and uses the ShellExecute function to call the CreateRDBDatabase.bat sample batch file, which calls the isql SQL Server command line utility to create a database called SavToRDB. The script then transfers data from the .sav file to the database. Before you run this sample, check that the parameters for the isql command are correct for your installation. If you are using the SQL Server Client Tools, you must change CreateRDBDatabase.bat to call sqlcmd instead of isql. If necessary, change the connection string in the OutputDataSource section of SavToRDB.dms to reflect the name of the server you are using. The OutputDataSource section in this sample is in an include file called RDBOutput.dms. To run this sample, you need the SQL Server Client Tools.
SavToDDF.dms
Transfers data from an IBM SPSS Statistics .sav file to a UNICOM Intelligence Data File (.ddf) using the SPSS Statistics SAV DSC to read the case data and the metadata. Uses the SavInput.dms and DDFOutput.dms Include files. For more information, see Transferring data from IBM SPSS Statistics.
SavWeighting.dms
Uses the Weight Component to add a weight to IBM SPSS Statistics SAV output files. The script is similar to the Weighting.dms sample. For more information about weighting, see Working with the Weight component.
Simple.dms
This transfers the Museum sample UNICOM Intelligence Data File to a new IBM SPSS Statistics .sav file, as shown in Example of a DMS file.
SplitIntoDirtyAndClean.dms
This transfers the Short Drinks sample data to two UNICOM Intelligence Data Files (.ddf), one of which stores the clean data and the other the dirty data. This is achieved using update queries in the two OutputDataSource sections to delete the clean data from the dirty data source and the dirty data from the clean data source. This sample assumes that the Short Drinks sample database has been restored to LocalHost. For more information, see Filtering data in a DMS file.
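Each of the two OutputDataSource sections carries a delete-style update query that removes the unwanted cases after the transfer; the following is only a hedged sketch of one of them, with a placeholder connection string, file names, and filter expression (the real sample's filter depends on how the dirty cases are flagged):

OutputDataSource(CleanOutput, "Clean cases")
    ConnectionString = "Provider=mrOleDB.Provider.2; Data Source=mrDataFileDsc; Location=Clean.ddf"
    MetaDataOutputName = "Clean.mdd"
    ' Remove the dirty cases from this output after the transfer;
    ' the WHERE clause shown here is illustrative only.
    UpdateQuery = "DELETE FROM vdata WHERE DataCleaning.Status.ContainsAny({NeedsReview})"
End OutputDataSource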
STAfSMerge.dms
This script merges coded data that was produced using IBM SPSS Text Analytics for Surveys with the original UNICOM Intelligence Interviewer - Server data to produce a new UNICOM Intelligence Interviewer - Server database. Before running this script, you must first edit the #define statements in STAfSMerge.txt and then run STAfSMerge.mrs. Those two files are installed in the same folder as STAfSMerge.dms. For more information, see Merging coded data into UNICOM Intelligence Interviewer.
SurveycraftToSav.dms
Transfers data from the Surveycraft Household sample data set to a new IBM SPSS Statistics .sav file. The Surveycraft MDSC is used to read metadata from the Surveycraft .vq file. For more information about reading Surveycraft files into UNICOM Intelligence, see Surveycraft DSC.
Tables-prefixed sample DMS files
These samples provide examples of scripting tables in a DMS file using UNICOM Intelligence Reporter and are listed separately. For more information, see Table scripting sample Data Management scripts.
TrackingStudy-prefixed sample DMS files
These samples are used to analyze the response data from a tracking study. For more information, see Analyzing a tracking study.
TripleSToDDF.dms
This script transfers data from the “Ski Demo” Triple-S sample metadata and case data files supplied with the UNICOM Intelligence Developer Documentation Library to a UNICOM Intelligence Data File (.ddf). For more information, see Transferring data from Triple-S.
UseInputAsOutput.dms
Demonstrates using the “use input as output” feature to update the input data source rather than creating an output data source. For more information, see OutputDataSource section.
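In outline, the feature is switched on in the OutputDataSource section instead of pointing that section at a new data source; a minimal, hedged sketch (the property placement is illustrative):

OutputDataSource(Output, "Update the input in place")
    ' With this flag set, changes made by the script are written back
    ' to the input data source and no separate output is created.
    UseInputAsOutput = True
End OutputDataSource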
UtilizeOuterDirectives.dms
This script demonstrates passing directives to the Data Management Script (DMS) via the IJob.Load function's second parameter. The script cannot be run independently. For more information, see 7: Running a Data Management script.
Weighting.dms
This uses the Weight Component to set up weighting based on equal numbers of male and female respondents, as shown in Setting up weighting in a DMS file.
WeightingFactors.dms
This uses the Weight Component to set up weighting using known weighting factors that are based on age groups and gender. A derived categorical variable is set up in the Metadata section for the age groups on which the factors are based. A variable usage type of Weight is specified to identify the weight variable to applications such as UNICOM Intelligence Reporter - Survey Tabulation.
WeightingTargets.dms
This is similar to the WeightingFactors.dms sample, except that it uses weighting targets rather than factors.
See also