Online Help

1. Overview

What Does In-Memory OLTP Simulator Do?
In-Memory OLTP Simulator enables IT professionals and developers to easily test the powerful In-Memory OLTP Engine of SQL Server® 2014 (or later) with different simulations and against different workloads. In the Ultimate Edition of the tool, among other things, the user can load data from different sources (e.g. from Production DBMS instances using Linked Servers) into the Simulator’s database, simulate real-life scenarios with the use of custom scenarios, and get in-depth analytics on performance when utilizing Memory-Optimized Tables and Natively-Compiled Stored Procedures in SQL Server®. In addition to using In-Memory OLTP Simulator for benchmarking and performance analysis, the user can also use it for resource sizing, as it provides real-time statistics on resource usage (e.g. CPU and memory) when running simulations.

image 
Figure 1.1: In-Memory OLTP Simulator Start Page.


How Does it Work?

The Community Edition (free) of In-Memory OLTP Simulator supports six standard scenarios that can be executed in three modes (where available):

  • Disk-Based
  • Memory-Optimized
  • Memory-Optimized with Natively-Compiled Stored Procedure

In Disk-Based execution mode, all tables are stored on disk. This is the "traditional" execution mode found in all Relational Database Management Systems (RDBMSs). The Disk-Based mode is considered the baseline for all simulations.

In Memory-Optimized execution mode, all tables are loaded and processed in memory. Under this mode, memory-optimized tables can also be combined with disk-based tables if this better suits the user’s needs.

In Memory-Optimized with Natively-Compiled Stored Procedure execution mode, the full power of the SQL Server® In-Memory OLTP Engine is utilized. Along with the memory-optimized data structures, a special type of SQL Server® stored procedure is used. Natively compiled stored procedures are compiled when they are created (in contrast to traditional disk-based stored procedures, which are compiled at their first execution), thus achieving significantly faster execution.
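To illustrate, a natively compiled stored procedure pairs with memory-optimized tables as in the following sketch (the table and procedure names are hypothetical and are not part of the Simulator’s scenarios):

```sql
-- Hypothetical example: a memory-optimized table and a natively
-- compiled stored procedure operating on it.
CREATE TABLE dbo.DemoOrders
(
    OrderID INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    Amount  MONEY NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
GO

-- Compiled to native code at CREATE time, not at first execution.
CREATE PROCEDURE dbo.usp_InsertDemoOrder
    @OrderID INT, @Amount MONEY
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.DemoOrders (OrderID, Amount)
    VALUES (@OrderID, @Amount);
END;
GO
```

Note that the database would need a MEMORY_OPTIMIZED_DATA filegroup for the table definition above to succeed; the Simulator’s database generation takes care of such prerequisites.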

In-Memory OLTP Simulator undertakes the execution of the scenarios in each mode, monitors the resource usage, logs all statistics and provides rich analytics. Additionally, in the Ultimate Edition of the tool, the user is able to create/modify and import/export custom scenarios, as well as access multiple processing options and additional analytics.

image
Figure 1.2: In-Memory OLTP Simulator Simulation Page.


Available Editions

There are currently two editions of In-Memory OLTP Simulator available:

  • Community Edition: This is the free version of the tool where basic functionality is enabled.
  • Ultimate Edition: This edition contains all the basic functionality plus the ability to create/modify and import/export custom scenarios, as well as access to more analytics. The Ultimate Edition of In-Memory OLTP Simulator can be very useful for organizations or individuals that want to create custom scenarios and simulate real Production data and processes using In-Memory Optimization in SQL Server®.

image
Figure 1.3: In-Memory OLTP Simulator Simulation Page After Running a Simulation Multiple Times.

For the full list of features available in each Edition of In-Memory OLTP Simulator, please visit the "Features" section on this web page.

Back to Top


2. Getting Started

a. Generate Database or Connect to SQL Instance
    a(i). Generate Database
    a(ii). Connect to SQL Instance
b. Select Simulation Scenario for Execution
    b(i). Standard Scenarios
    b(ii). Custom Scenarios
    b(iii). View Active Scenario Definition
c. Initialize Scenario
d. Run Disk-Based Mode
e. Run Memory-Optimized Mode (if available)
f. Run Memory-Optimized with Natively Compiled SP Mode (if available)
g. View Simulation Statistics

h. Advanced
    h(i). Scenario Manager
    h(ii). Emergency Thresholds
    h(iii). Resource Usage Monitoring
    h(iv). Emergency Actions


[Note 1]:
Never host In-Memory OLTP Simulator’s database on Production SQL Server® instances or Production servers. As the nature of the tool is to benchmark extreme simulations, you would risk the stability of your Production environment by hosting the Simulator’s database there. Use a Test server instead; if you want to simulate Production workload, you can copy the target data (e.g. via a linked server, SSIS, etc.) to the tables that will be created in the Simulator’s database based on the definition of the custom scenario. Custom scenarios can only be created in the Ultimate Edition of In-Memory OLTP Simulator.

[Note 2]:
If the SQL Server® instance that hosts the Simulator’s database is on a different machine than the one you use for running the Simulator, and you would like to run Standard Scenario 1, you will need to launch In-Memory OLTP Simulator with the "Run as administrator" option. This is required because Scenario 1 must be able to write the sample file it generates to the folder on the database server where the Simulator's database is located. Generally, it is recommended to run In-Memory OLTP Simulator on the database server that will be hosting the Simulator’s database(s).

2.a. Generate Database or Connect to SQL Instance
The first time you use In-Memory OLTP Simulator you will need to generate the Simulator database. This is the database that will be used for storing the data structures and sample/simulation data for all Simulator's scenarios, including the custom ones.

2.a(i). Generate Database
Via the "Configure" menu, select the "Generate Database" menu option and the below dialog will be displayed through which you can connect to the SQL Server® 2014 (or later) instance and generate the Simulator database:

image
Figure 2.1: Generate Database Dialog.


Generating the Simulator database is a one-off action (*). After the database is generated, you need to connect to it before doing anything else in the system.

* In the Ultimate Edition of In-Memory OLTP Simulator, you can generate and use multiple databases.

Note: You need to connect to the SQL Instance with SysAdmin privileges.

For more information please access the dedicated help topic “Configure: Generate Database”. 

2.a(ii). Connect to SQL Instance
The first time you use In-Memory OLTP Simulator, as described above, you need to generate its database. After you have generated the database during this first run, you are automatically connected to it and can start running simulations.

If the first run of In-Memory OLTP Simulator has already taken place and its database has already been generated, then you have to connect to the database before doing anything else in the system. To do that, launch the database connection dialog via the menu options “Configure - Connect to SQL Instance”.

image 
Figure 2.2: Connect to SQL Server® Instance Dialog (Community Edition).

In the Ultimate Edition of In-Memory OLTP Simulator you can connect to any of the multiple simulation databases generated via the database generation process.

image 
Figure 2.3: Connect to SQL Server® Instance Dialog (Ultimate Edition).

Note: You need to connect to the SQL Instance with SysAdmin privileges.

For more information please access the dedicated help topic “Configure: Connect to SQL Instance”.


2.b. Select Simulation Scenario for Execution
After connecting to In-Memory OLTP Simulator’s database, among many other functions, you can start running simulations. You can either use a standard scenario or, if you have the Ultimate Edition of the tool, create and use a custom scenario.

2.b(i). Standard Scenarios
In-Memory OLTP Simulator provides six standard simulation scenarios. These are:

  • Import Sample File of 300K Records: This standard scenario imports a sample file of 300K records into a single table in each mode. Its purpose is to simulate how fast a bulk insert can be when using the In-Memory OLTP Engine in SQL Server®. Note: The sample file is auto-generated during the initialization of the scenario and you need to “Run as Administrator” if you are not running In-Memory OLTP Simulator on the database server.

  • Update Single Column in Table of 200K Records: This scenario updates a single column twice in a table of 200K records in each mode. Its purpose is to simulate how fast an update operation can be when using the In-Memory OLTP Engine and Natively Compiled Stored Procedures in SQL Server®.
  • Insert 400K Records with Additional Column: This scenario inserts the 400K records from an existing table into a new table in each mode, along with populating two additional columns with data.

  • Update Multiple Columns in Table of 300K Records: This scenario updates three columns in a table of 300K records in each mode. Its purpose is to simulate how fast a heavy update operation can be when using the In-Memory OLTP Engine and Natively Compiled Stored Procedures in SQL Server®.

  • Multiple Aggregation on Single Table of 200K Records: This scenario is a simulation of multiple aggregations, such as COUNT, SUM, MIN and MAX, against a table of 200K records. In each mode, the data is retrieved from the original table, transformed with the aggregation functions, and stored in a new table on the fly.

  • Simulation of Sales Statistics Report with Tables of 200K Records: This scenario is a simulation of a simple sales statistics report in each mode. It aggregates data from tables with 200K records and stores it into a new table.
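For instance, the multiple-aggregation scenario above performs the kind of operation sketched below (illustrative only; the table and column names are hypothetical, not the scenario’s actual definition):

```sql
-- Illustrative sketch of a multiple-aggregation workload: aggregate a
-- source table and store the results in a new table on the fly.
SELECT
    COUNT(*)    AS record_count,
    SUM(Amount) AS total_amount,
    MIN(Amount) AS min_amount,
    MAX(Amount) AS max_amount
INTO dbo.AggregationResults   -- results stored in a new table
FROM dbo.SourceRecords;
```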


As you can see from the above list, the standard scenarios target different workloads. This allows you to test In-Memory Optimization in SQL Server® in several ways. Additionally, you can change the number of records involved in the standard scenarios from within Scenario Manager in order to run even more variations of the standard scenarios.

For more information on available Standard Scenarios and how to change the number of records, please access the dedicated help topic ”Scenario Manager: Standard Scenarios”.

2.b(ii). Custom Scenarios
If you have purchased the Ultimate Edition of In-Memory OLTP Simulator, you can create custom scenarios in Scenario Manager. This enables you to simulate many real-life scenarios. In-Memory OLTP Simulator guides you through the process of creating a custom simulation scenario, where you can generate sample data or get data from any SQL Server® instance via a linked server, store the data in the Simulator’s database and process it.

For more information on creating and using custom scenarios, please access the dedicated help topic ”Scenario Manager: Custom Scenarios”.

Also, for a step-by-step example on how to create a custom scenario, please access the dedicated help topic “Special Topics: Creating a Custom Scenario from Scratch”.

2.b(iii). View Active Scenario Definition
If you want a quick view of the definition of the scenario you selected, you can launch the “Active Scenario Definition” dialog from the menu “View – Active Scenario Definition”:

image Figure 2.4: Active Scenario Definition Dialog.

For more information please access the dedicated help topic “View: Active Scenario Definition”.

2.c. Initialize Scenario
After you connect to the Simulator’s database and select the simulation scenario for execution, the next step is to initialize the selected scenario by clicking on the “Initialize Scenario” button in the Simulations Page. When the initialization function is triggered, a series of steps runs in the background to prepare the scenario for execution. These are:

  • Create Tables (Tables DDL Scripts)
  • Create and Compile (where applicable) Stored Procedures (Stored Procedures Scripts)
  • Prepare Data for Processing (Data Population Script)
  • Initial Cleaning of Buffers
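The "Initial Cleaning of Buffers" step is performed by the Simulator itself; for reference, an equivalent manual cleanup on a test server would resemble the following (an assumption about what the built-in step may look like; never run this on Production):

```sql
-- Flush dirty pages to disk, then drop clean pages from the buffer
-- pool so the next run starts from a cold cache. Test servers only.
CHECKPOINT;
DBCC DROPCLEANBUFFERS;
```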

2.d. Run Disk-Based Mode
The next step is to run the Disk-Based Simulation Mode. This mode of execution is enabled in all standard scenarios and should always be enabled in any custom scenario that users of the Ultimate Edition of In-Memory OLTP Simulator create, as it is the baseline for each simulation.

The Disk-Based Simulation Mode uses the disk-based tables as data structures and executes the disk-based stored procedure as the data processing logic.

2.e. Run Memory-Optimized Mode (if available)
The next step is to run the Memory-Optimized Simulation Mode (if it is available in the selected scenario). This mode of execution uses the memory-optimized tables as data structures and executes the memory-optimized stored procedure as the data processing logic.

2.f. Run Memory-Optimized with Natively Compiled SP Mode (if available)
The next step is to run the Memory-Optimized with Natively Compiled SP Simulation Mode (if it is available in the selected scenario). This mode of execution uses the memory-optimized tables defined for the natively-compiled stored procedure as data structures and executes the natively compiled stored procedure as the data processing logic.

* Note: Prior to each simulation run, a Prep Script runs in order to prepare the new run for execution. The Prep Script’s execution time is not included in the measured execution time, as it is outside the scope of In-Memory OLTP Simulator’s benchmarking philosophy. The Prep Script is part of each scenario’s definition. Additionally, prior to running each mode, all buffers are cleaned by default by a built-in process in In-Memory OLTP Simulator, ensuring that all modes run on an equal footing. This cleaning is also not taken into account in the benchmarking process.

2.g. View Simulation Statistics
In the Simulation Page, besides the steps for running the simulation, there are also two graphs that illustrate the execution times for each mode as well as the different speedups. In addition, the winning mode of the simulation is displayed. This information is updated in the case of multiple runs.

Besides the statistics presented on the graphs in the Simulations page, additional information is provided through the Analytics module of In-Memory OLTP Simulator. Under Analytics, you can access “Simulation Statistics” which allows you to see start and end time, execution times, as well as other statistical information about the simulations that took place.

The below screenshot illustrates the Simulation Statistics page for a specific scenario’s execution:

image Figure 2.5: Simulation Statistics Page.

For more information on Simulation Statistics please access the dedicated help topic “Analytics: Simulation Statistics”.

2.h. Advanced
Besides the basic operation of In-Memory OLTP Simulator, several advanced features are also available that enable you to further control its operation and process any workload you may want to simulate.

2.h(i). Scenario Manager
Scenario Manager is one of the highlights of In-Memory OLTP Simulator. In Scenario Manager you can access all the standard scenarios in the system and in addition to viewing their definition, you can also change the number of records involved in the simulation (up to a limited number of records).

In the Ultimate Edition of In-Memory OLTP Simulator, Scenario Manager provides a unique capability: the ability to create and use custom scenarios. The system guides you through the process of creating custom scenarios, where, among other things, you can import data from other sources into the Simulator’s database and thus design processes that simulate real-life workloads while running within the Simulator’s database.

image
Figure 2.6: Scenario Manager (Ultimate Edition view).

For more information on Scenario Manager please access the dedicated help topic “Configure: Scenario Manager”.


2.h(ii). Emergency Thresholds
In-Memory OLTP Simulator allows you to set the following two emergency thresholds:

  • Memory Usage (%): This is the maximum allowed memory utilization percentage.
  • CPU Usage (%): This is the maximum allowed CPU utilization percentage.

During a scenario's initialization and execution, the Simulator’s engine constantly checks the memory and CPU utilization of the database server that hosts the Simulator’s database. If one of those metrics reaches the maximum allowed percentage, then the engine of In-Memory OLTP Simulator automatically cancels the scenario's execution and drops the scenario from the Simulator's database. This is done in order to ensure stability of the SQL Server® instance where the In-Memory OLTP Simulator database is hosted on.
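The Simulator’s monitoring is built in; for reference, a comparable manual check of server memory utilization can be done with the sys.dm_os_sys_memory DMV (illustrative only; this is not necessarily how the Simulator implements its checks):

```sql
-- Current memory utilization of the database server, expressed as a
-- percentage comparable to the Memory Usage (%) threshold.
SELECT
    total_physical_memory_kb / 1024     AS total_mb,
    available_physical_memory_kb / 1024 AS available_mb,
    CAST(100.0 * (total_physical_memory_kb - available_physical_memory_kb)
               / total_physical_memory_kb AS DECIMAL(5, 2)) AS memory_used_pct
FROM sys.dm_os_sys_memory;
```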

image 
Figure 2.7: Emergency Thresholds.


The default Memory and CPU thresholds are set to 70%.
For more information on Emergency Thresholds please access the dedicated help topic “Configure: Emergency Thresholds”.

2.h(iii). Resource Usage Monitoring
In-Memory Optimization in SQL Server® takes full advantage of the available RAM to achieve breakthrough performance. As a result, monitoring resource utilization on the database server is more necessary than ever. Besides the background resource utilization monitoring that lies at the heart of the engine of In-Memory OLTP Simulator, together with the Emergency Thresholds for ensuring server stability, In-Memory OLTP Simulator also provides a way of monitoring the current resource utilization in real time:

image Figure 2.8: Current Resource Usage Statistics.

Via the “Current Resource Usage Statistics” dialog you can take the “pulse” of the following resources on the database server to which In-Memory OLTP Simulator is connected:

  • CPU
  • Physical Memory (RAM)
  • Running Processes

For more information on current resource usage monitoring please access the dedicated help topic “View: Current Resource Usage”.

2.h(iv). Emergency Actions
As In-Memory Optimization involves allocating large amounts of RAM for storing multiple versions of the memory-optimized data, you also need to take into consideration possible hardware limitations of your server when it is used for In-Memory Optimization.

However, as In-Memory OLTP Simulator is an exploratory third-party tool for studying and testing the computational benefits of In-Memory Optimization for certain types of workloads, it offers two mechanisms for coping with cases where the memory that needs to be allocated for a scenario is larger than the amount available on your database server. These mechanisms are: (i) Emergency Thresholds and (ii) Emergency Actions for manual intervention by the user. The available Emergency Actions are:

  • Stop Simulation
  • Drop Scenario
  • Drop All Scenarios
  • Take Simulator Database Offline
  • Bring Simulator Database Online

For more information about the Emergency Actions available in the system please access the dedicated help topic “Emergency”.

Back to Top


3. Configure: Generate Database

The first time you use In-Memory OLTP Simulator you will need to generate its database. This is the database that will be used for storing the data structures and sample/simulation data for all Simulator's scenarios, including the custom ones. In order to do this you need to launch the database generation dialog:

image
Figure 3.1: Generate Simulator’s Database Dialog.

You can launch the database generation dialog with any of the following methods:

  • By navigating to “Configure – Generate Database”
  • By clicking on the following quick access button: image
  • By using the following key combination: Ctrl + Shift + G

After you enter the necessary credentials in the dialog’s fields and test the connection to the SQL Server® instance, assuming the connection is successful, you can then proceed with the database generation by clicking on the “Generate Database” button.

In-Memory Optimization is currently (at the time of writing this document – June 2015) available in the 64-bit version of the Evaluation, Developer and Enterprise editions of Microsoft SQL Server® 2014. To this end, before generating its database, In-Memory OLTP Simulator validates that the target SQL Server® instance satisfies the above requirements. Otherwise, the program does not generate the database and you cannot continue with the specified SQL Server® instance.
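You can perform a similar check manually with SERVERPROPERTY before pointing the tool at an instance (a sketch of the validation; the Simulator’s own check may differ):

```sql
-- Report version, edition and In-Memory OLTP (XTP) support for the
-- connected instance; IsXTPSupported returns 1 when supported.
SELECT
    SERVERPROPERTY('ProductVersion') AS product_version,
    SERVERPROPERTY('Edition')        AS edition,
    SERVERPROPERTY('IsXTPSupported') AS in_memory_oltp_supported;
```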

Note 1: In the Community Edition of In-Memory OLTP Simulator you can only generate and use a single simulation database. If you want to be able to generate and use multiple simulation databases you may consider upgrading to the Ultimate Edition of In-Memory OLTP Simulator.

Note 2: You need to connect to the SQL Instance with SysAdmin privileges.

Back to Top


4. Configure: Connect to SQL Instance

Each time you launch In-Memory OLTP Simulator, you have to connect to the Simulator's database (*).

You can launch the database connection dialog with any of the following methods:

  • By navigating to "Configure - Connect to SQL Instance"
  • By clicking on the following quick-access button: image
  • By using the key combination Ctrl + Shift + C

Below you can see the database connection dialog:

 

image 
Figure 4.1: Connect to SQL Server® Instance (Community Edition view).


In the above dialog, after you type/select the SQL Server® instance to connect to, you then need to select the authentication method which can be either "Windows Authentication" or "SQL Server Authentication".

If you select "SQL Server Authentication" you need to provide a username and password that has sysadmin access on the SQL Server® 2014 (or later) instance that will be specified on the "Server name" field and which hosts the Simulator's database.

If you select "Windows Authentication" as the authentication type then your current Windows login will be authenticated against the SQL Server® instance.

Note: The database connection process does not connect to the SQL Server® instance if an In-Memory OLTP Simulator database is not available and online (previously generated via the Database Generation process). If a database is not present, you need to generate it from “Configure – Generate Database”.

* In the Ultimate Edition of In-Memory OLTP Simulator you can generate and connect to multiple simulation databases. In that case, the database connection dialog presents an additional option that lets you choose the simulation database to connect to:

 

image 
Figure 4.2: Connect to SQL Server® Instance (Ultimate Edition view).


Note: You need to connect to the SQL Server® instance with a login that has the SysAdmin server role, because via this connection In-Memory OLTP Simulator generates and uses simulation databases, linked servers and other instance-level objects.

Back to Top


5. Configure: Scenario Manager

Scenario Manager is the heart of In-Memory OLTP Simulator. Scenario Manager is the engine that organizes and stores the definitions of all scenarios; standard and custom. It is the engine that allows you to change the number of records for standard scenarios in the Community Edition and create powerful scenarios that simulate heavy processing of data from different sources (i.e. workload from a Production system) in the Ultimate Edition.

 

image 
Figure 5.1: Scenario Manager (Ultimate Edition View).

You can launch Scenario Manager with any of the following methods:

  • By navigating to “Configure – Scenario Manager”
  • By clicking on the following quick access button: image
  • By using the following key combination: Ctrl + Shift + M

Back to Top


6. Configure: Scenario Manager – Standard Scenarios

In Scenario Manager, all the standard scenarios are listed (in the Ultimate Edition you can also view any custom scenarios that were created). In addition to viewing the standard scenarios’ definitions, you can also change the number of records involved in their simulations.

This function is provided to all users of In-Memory OLTP Simulator to let them change the workload for standard scenarios and experiment with different volumes of data. So, even though the standard scenarios’ definitions cannot be modified, the number of records to be processed can be changed via this special function.

The below screenshot illustrates an example where the number of records for standard scenario 2 is changed from 200K to 100K:

image Figure 6.1: Changing the Number of Records for Standard Scenarios.

As you can see in the next screenshot, the number of records was successfully changed along with the scenario definition that was automatically updated*:

image 
Figure 6.2: Updated Standard Scenario After No. of Records Update.

Now, all that is left is to exit Scenario Manager, return to the Simulation Page and execute the updated standard scenario.

* Note: Any changes to the number of records for the standard scenarios are not permanent; they are only valid for your current session in In-Memory OLTP Simulator. The next time you start the program, all standard scenarios will be loaded with their default definitions, including the number of records. Also, note that the special function for changing the number of records is only available for standard scenarios. In the Ultimate Edition of In-Memory OLTP Simulator, if you want to change the number of records or anything else about a custom scenario, you can simply modify it.

Back to Top

 

7. Configure: Scenario Manager – Custom Scenarios

In the Ultimate Edition of In-Memory OLTP Simulator you have access to the premium features of the tool. The most significant is the ability to create, modify, export and import custom scenarios via Scenario Manager. Custom scenarios follow the same structure as standard scenarios, with the difference that you write their definitions, which gives you virtually unlimited power in simulating any workload against SQL Server® In-Memory Optimization.

image 
Figure 7.1: Scenario Manager (Ultimate Edition View) – Creating a Custom Scenario.


In order to create a custom scenario, you will need to define the following entities (depending on whether you want to enable all three modes of execution):

  • Metadata: Basic information about the scenario like description, which execution modes it supports, etc. This is the page where you select which execution modes will be enabled. Disk-Based execution mode is always enabled as it serves as the baseline.
  • Disk-Based Tables: The definition for the disk-based table(s). These are the traditional RDBMS’s tables.
  • Mem-Opt Tables: The definition for the memory-optimized table(s).
  • Mem-Opt Tables for Native SP: The definition for the memory-optimized tables used in the memory-optimized with natively-compiled stored procedure execution mode.
  • Data Population Script: This is the script that prepares the workload for the scenario. The data can be either sample data (like in the case of the standard scenarios) or Production data fetched into the Simulator’s database from within In-Memory OLTP Simulator with the use of Linked Servers or from outside In-Memory OLTP Simulator using SQL Server® technologies like SSIS/Data Export tools.
  • Prep Script: This script can include preparatory SQL statements that run prior to the execution of each new simulation run. The Prep Script’s execution time is not taken into consideration in the benchmarking process as it is out of the scope of In-Memory OLTP Simulator’s philosophy.
  • Disk-Based SP: The stored procedure that contains the scenario’s logic for the Disk-Based execution mode (uses the Disk-Based Tables).
  • Mem-Opt SP: The stored procedure that contains the scenario’s logic for the Memory-Optimized execution mode (uses the Mem-Opt Tables).
  • Natively-Compiled SP: The stored procedure that contains the scenario’s logic for the Memory-Optimized with Natively-Compiled Stored Procedure execution mode (uses the Mem-Opt Tables for Native SP).
  • Execution Script: The calls to the three stored procedures along with the input parameters (where available).
  • Housekeeping Script: Post-execution script for maintenance purposes that runs in the end of each simulation run.

Note that all the above entities are defined with T-SQL code (DDL) and you need at least intermediate T-SQL knowledge to create them easily. However, to further assist the user, Scenario Manager generates T-SQL templates based on basic input from the user and also provides short live tips during the various steps of custom scenario creation. In a future release of In-Memory OLTP Simulator, a designer will be added that will allow the user to set the configuration for the different entities via a GUI.
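As an example of the kind of DDL these entities contain, the Disk-Based Tables and Mem-Opt Tables entities for a simple custom scenario might look like this (hypothetical names; a sketch, not a complete scenario definition):

```sql
-- Disk-Based Tables entity: a traditional table for the baseline mode.
CREATE TABLE dbo.SalesDisk
(
    SaleID INT NOT NULL PRIMARY KEY,
    Amount MONEY NOT NULL
);
GO

-- Mem-Opt Tables entity: the same shape, memory-optimized.
CREATE TABLE dbo.SalesMemOpt
(
    SaleID INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 500000),
    Amount MONEY NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```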

After you complete the required definitions as above, you can check the syntax. The parser behind this function, besides checking standard disk-based T-SQL syntax, also performs special checks on the memory-optimized data structures’ definitions, as well as overall checks relating to how a custom scenario should be constructed.

For step-by-step information on how to create a custom scenario from scratch please visit the dedicated help topic “Special Topics: Creating a Custom Scenario from Scratch”.

Back to Top

 

8. Configure: Emergency Thresholds

In-Memory OLTP Simulator allows you to set the following two “emergency thresholds”: one for the maximum allowed memory utilization percentage and another for the maximum allowed CPU utilization percentage. You can set these values using the below dialog:

image 
Figure 8.1: Emergency Thresholds Dialog.


You can launch the Emergency Thresholds dialog with any of the following methods:

  • By navigating to "Configure - Emergency Thresholds"
  • By clicking the following quick access button: image
  • By using the key combination: Ctrl + Shift + E

During a scenario's initialization and execution, the Simulator’s engine constantly checks the memory and CPU utilization of the database server that hosts the Simulator’s database. If one of those metrics reaches the maximum allowed percentage, then the engine of In-Memory OLTP Simulator automatically cancels the scenario's execution and drops the scenario from the Simulator's database. This is done in order to ensure stability of the SQL Server® instance where the In-Memory OLTP Simulator database is hosted on.

The default emergency thresholds are set to 70% for both Memory and CPU utilization.

The allowed value range for both Memory and CPU utilization percentage is: 50% - 90%

Back to Top

 

9. Emergency

As In-Memory Optimization involves allocating large amounts of RAM for storing multiple versions of the memory-optimized data, you also need to take into consideration possible hardware limitations of your server when it is used for In-Memory Optimization.

However, as In-Memory OLTP Simulator is an exploratory third-party tool for showcasing the computational benefits of In-Memory Optimization for certain types of workloads, it offers two mechanisms for coping with cases where the memory that needs to be allocated for a scenario is larger than the amount available on your database server. These mechanisms are: (i) Emergency Thresholds and (ii) Emergency Actions for manual intervention by the user.

IMAGE 
Figure 9.1: The Emergency Actions Menu.


The available manual emergency actions are:

  • Stop Simulation
  • Drop Scenario
  • Drop All Scenarios
  • Take Simulator Database Offline
  • Bring Simulator Database Online

Back to Top

 

10. Emergency: Stop Simulation

The “Stop Simulation” action in the Emergency menu enables you to stop the simulation that is currently running. Even though the emergency thresholds are constantly monitored by the engine of In-Memory OLTP Simulator and certain preventive actions are taken automatically to prevent possible system instability, the “Stop Simulation” action lets you manually stop the execution of the simulation scenario for any reason.

image
Figure 10.1: The Emergency Actions Menu - Stop Simulation.


You can stop the running simulation with any of the following methods:

  • By navigating to “Emergency – Stop Simulation”
  • By clicking on the following quick access button: image
  • By using the following key combination: Ctrl + Shift + S

Back to Top

 

11. Emergency: Drop Scenario

The “Drop Scenario” action can only be used when a simulation is not in progress. It drops the scenario’s data structures (e.g. tables, stored procedures) from the simulator’s database in order to allow the operating system on the server to reclaim the resources previously allocated to these data structures.

image 
Figure 11.1: The Emergency Actions Menu - Drop Scenario.

You can drop the selected scenario with any of the following methods:

  • By navigating to “Emergency – Drop Scenario”
  • By clicking on the following quick access button: image
  • By using the following key combination: Ctrl + Shift + D

Back to Top

 

12. Emergency: Drop All Scenarios

The “Drop All Scenarios” action can only be used when a simulation is not in progress. This action drops the data structures (e.g. tables, stored procedures) of all scenarios from the simulator’s database in order to allow the operating system on the server to reclaim any resources previously allocated to the data structures of all In-Memory OLTP Simulator scenarios that were initialized at least once.

image 
Figure 12.1: The Emergency Actions Menu - Drop All Scenarios.


You can drop all scenarios with any of the following methods:

  • By navigating to “Emergency – Drop All Scenarios”
  • By using the following key combination: Ctrl + Shift + O

Back to Top

 

13. Emergency: Take Simulator DB Offline

The “Take Simulator Database Offline” action takes In-Memory OLTP Simulator’s database offline in order to free up any resources which might have been allocated to the data structures of any In-Memory OLTP Simulator’s scenario.

image 
Figure 13.1: The Emergency Actions Menu - Take Simulator DB Offline.


You can take the Simulator database offline with any of the following methods:

  • By navigating to “Emergency – Take Simulator Database Offline”
  • By using the following key combination: Ctrl + Shift + F

Note: After this action is performed, in order to use In-Memory OLTP Simulator again you will need to bring the database back online, either from within In-Memory OLTP Simulator or from within SQL Server® Management Studio.
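For reference, taking a database offline corresponds to a standard ALTER DATABASE statement that you could run from SQL Server® Management Studio instead. This is only a sketch; the database name below is an assumption, so substitute the actual name of the Simulator’s database:

```sql
-- Hypothetical database name; substitute the actual Simulator database name.
-- ROLLBACK IMMEDIATE disconnects active sessions so the database can go offline.
ALTER DATABASE [InMemoryOLTPSimulatorDB] SET OFFLINE WITH ROLLBACK IMMEDIATE;
```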

Back to Top

 

14. Emergency: Bring Simulator DB Online

If for any reason you had to take the Simulator’s database offline (e.g. to release resources back to the operating system), the “Bring Simulator Database Online” function lets you try bringing it back online so that you can use the Simulator’s functions again.

image 
Figure 14.1: The Emergency Actions Menu - Bring Simulator DB Online.


You can bring the Simulator database online with any of the following methods:

  • By navigating to “Emergency – Bring Simulator Database Online”
  • By using the following key combination: Ctrl + Shift + N

Note: If for any reason it is not possible to bring the Simulator’s database back online, you can try bringing it online using SQL Server® Management Studio. If the database is successfully brought back online from within the Simulator, the connection to the database is restored as well. Otherwise, after bringing the database back online from SQL Server® Management Studio, you will need to reconnect to the database via “Configure – Connect to Database”.
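Bringing a database back online from SQL Server® Management Studio is likewise a single ALTER DATABASE statement. A minimal sketch, with the database name being an assumption:

```sql
-- Hypothetical database name; substitute the actual Simulator database name.
ALTER DATABASE [InMemoryOLTPSimulatorDB] SET ONLINE;
```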

Back to Top

 

15. View: Active Scenario Definition

With this function you have view-only access to the active scenario’s definition. The screenshot below shows an example of a standard scenario’s definition:

 

image Figure 15.1: The View Active Scenario Definition Dialog.


You can access the “View Active Scenario Definition” function with any of the following methods:

  • By navigating to “View – Active Scenario Definition”
  • By clicking on the following quick access button: image
  • By using the following key combination: Ctrl + Shift + A

In the “Active Scenario Definition” dialog, you are able to view the following information about the active scenario’s definition:

  • Metadata: Some basic information about the scenario like description, which execution modes it supports, etc.
  • Disk-Based Tables: The definition for the disk-based table(s). These are the traditional RDBMS’s tables.
  • Mem-Opt Tables: The definition for the memory-optimized table(s).
  • Mem-Opt Tables for Native SP: The definition for the memory-optimized tables used in the memory-optimized with natively-compiled stored procedure execution mode.
  • Data Population Script: This is the script that prepares the workload for the scenario. The data can be either sample data (as in the case of the standard scenarios) or data from different sources (e.g. from Production SQL Server® Instances), fetched into the Simulator’s database from within In-Memory OLTP Simulator with the use of Linked Servers, or from outside the Simulator using SQL Server® technologies like SSIS/Data Export tools.
  • Prep Script: This script can include preparatory SQL statements that run prior to the execution of each new simulation run. The Prep Script’s execution time is not taken into consideration in the benchmarking process as it is out of the scope of In-Memory OLTP Simulator’s philosophy.
  • Disk-Based SP: The stored procedure that contains the scenario’s logic for the Disk-Based execution mode (uses the Disk-Based Tables).
  • Mem-Opt SP: The stored procedure that contains the scenario’s logic for the Memory-Optimized execution mode (uses the Mem-Opt Tables).
  • Natively-Compiled SP: The stored procedure that contains the scenario’s logic for the Memory-Optimized with Natively-Compiled Stored Procedure execution mode (uses the Mem-Opt Tables for Native SP).
  • Execution Script: The calls to the three stored procedures along with the input parameters (where available).
  • Housekeeping Script: Post-execution script for maintenance purposes.

Note: Standard scenario definitions cannot be changed in Scenario Manager; only the number of records involved in the computation can be changed. This functionality is offered as an extra in the Community Edition of In-Memory OLTP Simulator to enable users to run the standard scenarios with a heavier or lighter workload. Users of the Ultimate Edition also have access to this feature.

Back to Top

 

16. View: Current Resource Usage

In-Memory Optimization in SQL Server® takes full advantage of the available RAM to achieve breakthrough performance. To this end, monitoring resource utilization on the database server is more necessary than ever. Besides the background resource utilization monitoring that lies at the heart of the engine of In-Memory OLTP Simulator, along with the Emergency Thresholds for ensuring server stability, In-Memory OLTP Simulator also provides a way of monitoring the current resource utilization in real time:

image Figure 16.1: Current Resource Usage Statistics Dialog.


Via the “Current Resource Usage Statistics” dialog you can take the “pulse” of the following resources on the database server to which In-Memory OLTP Simulator is connected:

  • CPU
  • Physical Memory (RAM)
  • Running Processes

By right-clicking on each graph you can export it as an image. Also, by right-clicking on the dialog, you can export everything as an image.

You can launch the “Current Resource Usage Statistics” view using any of the following methods:

  • By navigating to “View – Current Resource Usage”
  • By clicking on the quick access button: image
  • By using the key combination: Ctrl + Shift + R

Back to Top

 

17. Analytics: Simulation Statistics

The Analytics module of In-Memory OLTP Simulator provides access to Simulation Statistics and an Executive Report*.
                                                              
The Simulation Statistics page provides analytical information about the outcome of a scenario’s execution(s):

image Figure 17.1: Simulation Statistics Page.


In addition to the graphs, the Simulation Statistics page provides the following information for all the enabled execution modes of the scenario that ran:

  • Start Time
  • End Time
  • Elapsed Time
  • Speedup
  • Fastest? (or Slowest)

In the case of multiple runs of the same scenario, the simulation statistics can be filtered based on the following filters (View Run / Comparison Mode):

  • Fastest Run
    • Based on Speedup
    • Based on Execution Time
  • Slowest Run
    • Based on Speedup
    • Based on Execution Time
  • Average Run
    • Only Based on Speedup
  • Most Recent Run
    • Only Based on Speedup

For the Fastest and Slowest “View Run” filters an additional filtering option is available: the “Comparison Method”. The Comparison Method allows you to choose which metric determines a run as the fastest or the slowest; this metric can be either the speedup or the execution time. The Comparison Method option is relevant in the case of multiple runs of the same scenario. If you ran the scenario only once, the Comparison Method makes no difference.

How the Fastest Run is Determined
In the case where the Comparison Method is set to “Speedup”, the fastest run is determined as the one that has the largest speedup of all runs. Then the execution mode with the largest speedup within that run is considered as the winner of the simulation.

In the case where the Comparison Method is set to “Execution Time”, the fastest run is determined as the one that has the shortest execution time of all runs. Then the execution mode with the shortest execution time within that run is considered as the winner of the simulation.

How the Slowest Run is Determined
In the case where the Comparison Method is set to “Speedup”, the slowest run is determined as the one that has the smallest speedup of all runs. Then the execution mode with the smallest speedup within that run is considered as the slowest mode of the simulation.

In the case where the Comparison Method is set to “Execution Time”, the slowest run is determined as the one that has the longest execution time of all runs. Then the execution mode with the longest execution time within that run is considered as the slowest mode of the simulation.

How the Average Run is Determined
The Average Run calculates the ratios (speedups) between the average execution times of the Disk-Based, Memory-Optimized, and Memory-Optimized with Natively-Compiled Stored Procedure modes. The average execution time for each mode is the total elapsed time divided by the number of runs.

The table below shows an example of how the average execution times and speedups are calculated:

Mode                                                       Run 1 (ms)  Run 2 (ms)  Run 3 (ms)  Average Exec Time (ms)   Speedup
Disk-Based (baseline)                                      1000        1100        900         (1000+1100+900)/3=1000   (1000/1000)=1.00x
Memory-Optimized                                           800         750         700         (800+750+700)/3=750      (1000/750)=1.33x
Memory-Optimized with Natively-Compiled Stored Procedure   500         450         400         (500+450+400)/3=450      (1000/450)=2.22x

Table 17.1: How the Average Run is Determined.
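The calculation in Table 17.1 can be reproduced with plain T-SQL. The following is an illustrative sketch using a temporary table loaded with the run times from the example above; the table and mode names are hypothetical and not part of the Simulator’s schema:

```sql
-- Run times (ms) from Table 17.1, loaded into a temporary table
CREATE TABLE #runs (Mode varchar(60), RunNo int, ElapsedMs float);
INSERT INTO #runs VALUES
  ('Disk-Based', 1, 1000), ('Disk-Based', 2, 1100), ('Disk-Based', 3, 900),
  ('Memory-Optimized', 1, 800), ('Memory-Optimized', 2, 750), ('Memory-Optimized', 3, 700),
  ('Mem-Opt + Native SP', 1, 500), ('Mem-Opt + Native SP', 2, 450), ('Mem-Opt + Native SP', 3, 400);

-- Average execution time per mode, and speedup versus the Disk-Based baseline
SELECT Mode,
       AVG(ElapsedMs) AS AvgExecMs,
       (SELECT AVG(ElapsedMs) FROM #runs WHERE Mode = 'Disk-Based')
         / AVG(ElapsedMs) AS Speedup
FROM #runs
GROUP BY Mode;
-- Disk-Based: 1000 ms (1.00x); Memory-Optimized: 750 ms (~1.33x); Mem-Opt + Native SP: 450 ms (~2.22x)
```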

The information provided in the Simulation Statistics page can be exported using any of the following methods:

  • Print Full Report: Prints the Simulation Statistics page.
  • Export Report as Image: Exports the Simulation Statistics page as image.
  • Print Only Statistics: Sends the statistics (text only) to the printer.
  • Export Statistics to File: Exports the statistics (text only) to a text file.

You can launch the Simulation Statistics dialog with any of the following methods:

  • By navigating to "Analytics – Simulation Statistics"
  • By clicking the following quick access button:
  • By clicking on the hyperlink provided on the Simulation Page after a full simulation cycle is completed.
  • By using the key combination: Ctrl + Shift + T

* The Executive Report is only available in the Ultimate Edition of In-Memory OLTP Simulator.

Back to Top

18. Analytics: Executive Report

The Executive Report function is available only in the Ultimate Edition of In-Memory OLTP Simulator and provides an executive summary of the current simulation outcome. Among other things, it contains the scenario description as well as high-level statistics focusing mainly on the fastest simulation run. The Executive Report also enables the user to easily send the report’s statistics by email by launching the default mail client of the computer on which In-Memory OLTP Simulator runs.

image Figure 18.1: Executive Report Dialog.


In the Executive Report Page, you are presented with three actions:

  • Print Report.
  • Export Report as Image.
  • Email the report’s statistics using the default mail client.

You can launch the Executive Report Dialog with any of the following methods:

  • By navigating to "Analytics – Executive Report"
  • By clicking the following quick access button: image
  • By clicking on the hyperlink provided on the Simulation Page after a full simulation cycle is completed.
  • By using the key combination: Ctrl + Shift + U

Back to Top

 

19. Special Topics: Available Standard Scenarios

The current release of In-Memory OLTP Simulator provides six standard scenarios. These are:

  • Import Sample File of 300K Records: This standard scenario imports a sample file of 300K records into a single table in each mode. Its purpose is to simulate how fast a bulk insert can be when using the In-Memory OLTP Engine in SQL Server®. Note: The sample file is auto-generated during the initialization of the scenario and you need to “Run as Administrator” if you are not running In-Memory OLTP Simulator on the database server.
  • Update Single Column in Table of 200K Records: This scenario updates a single column twice in a table of 200K records in each mode. Its purpose is to simulate how fast an update operation can be when using the In-Memory OLTP Engine and Natively Compiled Stored Procedures in SQL Server®.

  • Insert 400K Records with Additional Column: This scenario inserts the 400K records from an existing table into a new table in each mode, along with populating two additional columns with data.
  • Update Multiple Columns in Table of 300K Records: This scenario updates three columns in a table of 300K records in each mode. Its purpose is to simulate how fast a heavy update operation can be when using the In-Memory OLTP Engine and Natively Compiled Stored Procedures in SQL Server®.

  • Multiple Aggregation on Single Table of 200K Records: This scenario simulates multiple aggregations, such as COUNT, SUM, MIN and MAX, against a table of 200K records. In each mode, the original data is retrieved from a source table, transformed on the fly with the aggregation functions, and the results are stored in a new table.

  • Simulation of Sales Statistics Report with Tables of 200K Records: This scenario is a simulation of a simple sales statistics report in each mode. It aggregates data from tables with 200K records and stores it into a new table.

Back to Top

 

20. Special Topics: Creating a Custom Scenario from Scratch

The Ultimate Edition of In-Memory OLTP Simulator gives you access to the premium features of the tool. The most significant feature is the ability to create, modify, export and import custom scenarios via Scenario Manager. Custom scenarios follow the same structure as standard scenarios, with the difference that you write their definition yourself; this gives you unlimited power in terms of simulating any workload against SQL Server® In-Memory Optimization.

This topic shows by example how you can create a custom scenario from scratch and can be used as a guide for creating custom scenarios in In-Memory OLTP Simulator.

Step 1: Perform Analysis of the Workload
Before we start writing the custom scenario we need to study the workload we want to simulate in In-Memory OLTP Simulator. To this end, in this example, we will focus on a fictional “Production” database named “SalesDB” and we will import the target data into the Simulator’s database via a Linked Server object. The following diagram illustrates the database’s schema:

image 
Figure 20.1: Diagram of Fictional Production Database “SalesDB”.

 

As you can see from the above diagram, the “SalesOrderHeader” table contains 10M records. The logic of the custom scenario is to find the top 20 customers in the last 10 years based on the sales data.

The first thing to do before you start working in In-Memory OLTP Simulator is to prepare the main query for your custom scenario. To this end, design and test your query against the existing database in SQL Server® Management Studio (SSMS). Once you have the final query, you will have, besides the logic itself, the names of the involved tables.

For this example, the query is the following:

SELECT TOP 20 c.name AS CusName, SUM(s.TotalAmount) AS TotalCusSales
FROM dbo.salesOrderHeader s
       INNER JOIN dbo.Customer c ON s.cusID=c.id
WHERE DATEDIFF(yyyy,s.salesDate,GETDATE())<=10
GROUP BY c.name
ORDER BY 2 DESC    
    

Listing 20.1: The Sales Comparative Main Query.

 

The involved tables are:

  • SalesOrderHeader
  • Customer

*Note: In this example we will also load the data for the “Area” table even though this is optional for this sample custom scenario.

For each execution mode you will also need to have one output table:

  • For Disk-Based Mode: d_output
  • For Memory-Optimized Mode: m_output
  • For Memory-Optimized with Natively-Compiled SP: n_output

*Note: In the Prep Script part of the scenario’s definition, you will also need to delete the entries of the above three output tables so that they are ready for the results of a new run. See Step 8 for more info.


Step 2: Launch Scenario Manager
Launch Scenario Manager with any of the following methods:

  • By navigating to “Configure – Scenario Manager”
  • By clicking on the following quick access button: image
  • By using the following key combination: Ctrl + Shift + M

Step 3: Click on the “Add” Button – Complete (1) Metadata
After clicking on the “Add” button, you can start filling out the new custom scenario’s metadata:

image 
Figure 20.2: Scenario Manager (Ultimate Edition View) - Add Custom Scenario and Complete Metadata.

 

Step 4: Complete (2) Disk-Based Tables Definition
In this step you will need to create the following disk-based tables:

  • d_area
  • d_customer
  • d_salesOrderHeader
  • d_output*

Note: “d_” stands for disk-based table.

image 
Figure 20.3: New Custom Scenario: Disk-Based Tables.


In this step you need to enter the Base Table Names for the disk-based tables (separated by semicolons (;) if more than one) and click on the “Set” button. This generates a basic T-SQL template script which you can further modify in order to define the table columns (number and type) based on the table definitions in the original database. Also, make sure to properly define the output table which will host the computation’s results (this is critical).

*Note: Every execution mode needs to have its output table(s) in order to host the computation’s result set.

The final T-SQL script is also provided in the below listing:

--Disk-Based Table Definition Template
CREATE TABLE [dbo].[d_area]
(
[id] [int] NOT NULL PRIMARY KEY,
[descr] [varchar](100),
[remarks] [varchar](250)
)

--Disk-Based Table Definition Template
CREATE TABLE [dbo].[d_customer]
(
[id] [int] NOT NULL PRIMARY KEY,
[name] [varchar](100),
[mailAddress] [varchar](250),
[areaID] int
)

--Disk-Based Table Definition Template
CREATE TABLE [dbo].[d_salesOrderHeader]
(
[reference] [bigint] NOT NULL PRIMARY KEY,
[cusID] [int],
[salesDate] [date],
[NetAmount] [float],
[VatPerc] [float],
[TotalAmount] [float]
)

--Disk-Based Table Definition Template
CREATE TABLE [dbo].[d_output]
(  
[CusID] [int] NOT NULL PRIMARY KEY,
[CusName] varchar(50) NOT NULL,
[TotalCusSales] [float] NOT NULL
)

Listing 20.2: Disk-Based Table Definitions.


Step 5: Complete (3) Memory-Optimized Tables Definition
In this step you will need to create the following memory-optimized tables:

  • m_area
  • m_customer
  • m_salesOrderHeader
  • m_output*

Note: “m_” stands for memory-optimized table.

*Note: Every execution mode needs to have its output table(s) in order to host the computation’s result set.

image 
Figure 20.4: New Custom Scenario: Memory-Optimized Tables.


In this step you need to enter the Base Table Names for the memory-optimized tables, separated by semicolons (;), as well as the estimated (approximate) number of records for each table, also separated by semicolons, in the “Number of Records” input box. Then, by clicking on the “Set” button, Scenario Manager generates a basic T-SQL template script which you can further modify in order to define the table columns (number and type) based on the table definitions in the original database. Also, make sure to properly define the output table which will host the computation’s results (this is critical).


As you can see in the table definitions, the bucket_count parameter needs to be set for hash indexes in order to create performant memory-optimized data structures (for the memory-optimized and memory-optimized with natively-compiled stored procedure simulation modes). If you were to specify this value manually, you would need to determine it yourself. However, In-Memory OLTP Simulator, given the estimated number of records for the memory-optimized tables, calculates the bucket_count value for you. Of course, as the definition of all scenarios is based on T-SQL, you can further tune this value along with anything else in the custom scenario.
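For reference, Microsoft’s general guidance for hash indexes is to set BUCKET_COUNT to roughly one to two times the expected number of distinct index key values; SQL Server® rounds the specified value up to the next power of two. A sketch of this rule of thumb (the row count below is just an example):

```sql
-- Rule-of-thumb sizing for a hash index bucket count (illustrative only):
-- 1-2x the expected distinct key values; the engine rounds up to a power of two.
DECLARE @estimatedDistinctKeys int = 150000;
SELECT 2 * @estimatedDistinctKeys AS SuggestedBucketCount;  -- SQL Server rounds this up internally
```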

The final T-SQL script is also provided in the below listing:

--Memory-Optimized Table Definition
CREATE TABLE [dbo].[m_area]
(
[id] [int] NOT NULL,
[descr] [varchar](100),
[remarks] [varchar](250)

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[id]
)WITH ( BUCKET_COUNT = 3000)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

 

--Memory-Optimized Table Definition
CREATE TABLE [dbo].[m_customer]
(
[id] [int] NOT NULL,
[name] [varchar](100),
[mailAddress] [varchar](250),
[areaID] int

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[id]
)WITH ( BUCKET_COUNT = 150000)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

 

--Memory-Optimized Table Definition
CREATE TABLE [dbo].[m_salesOrderHeader]
(
[reference] [bigint] NOT NULL,
[cusID] [int],
[salesDate] [date],
[NetAmount] [float],
[VatPerc] [float],
[TotalAmount] [float]

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[reference]
)WITH ( BUCKET_COUNT = 30000000)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

 

--Memory-Optimized Table Definition
CREATE TABLE [dbo].[m_output]
(
[CusID] [int] NOT NULL,
[CusName] varchar(50) COLLATE Latin1_General_100_BIN2 NOT NULL,
[TotalCusSales] [float] NOT NULL

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[CusID]
)WITH ( BUCKET_COUNT = 60)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

Listing 20.3: Memory-Optimized Table Definitions.

 

Step 6: Complete (4) Memory-Optimized Tables Definition for Natively-Compiled SP
In this step you will need to create the following memory-optimized tables for use by the natively-compiled stored procedure:

  • n_area
  • n_customer
  • n_salesOrderHeader
  • n_output*

Note: “n_” stands for memory-optimized table for natively-compiled SP.

*Note: Every execution mode needs to have its output table(s) in order to host the computation’s result set.

image 
Figure 20.5: New Custom Scenario: Memory-Optimized Tables for Natively-Compiled SP.


In this step you need to enter the Base Table Names for the memory-optimized tables that will be used by the natively-compiled stored procedure, separated by semicolons (;), as well as the estimated (approximate) number of records for each table, also separated by semicolons, in the “Number of Records” input box. Then, by clicking on the “Set” button, Scenario Manager generates a basic T-SQL template script which you can further modify in order to define the table columns (number and type) based on the table definitions in the original database. Also, make sure to properly define the output table which will host the computation’s results (this is critical).

The final T-SQL script is also provided in the below listing:

--Memory-Optimized Table Definition for Natively-Compiled SP
CREATE TABLE [dbo].[n_area]
(
[id] [int] NOT NULL,
[descr] [varchar](100),
[remarks] [varchar](250)

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[id]
)WITH ( BUCKET_COUNT = 3000)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

 

--Memory-Optimized with Native SP Table Definition
CREATE TABLE [dbo].[n_customer]
(
[id] [int] NOT NULL,
[name] [varchar](100),
[mailAddress] [varchar](250),
[areaID] int

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[id]
)WITH ( BUCKET_COUNT = 150000)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

 

--Memory-Optimized with Native SP Table Definition
CREATE TABLE [dbo].[n_salesOrderHeader]
(
[reference] [bigint] NOT NULL,
[cusID] [int],
[salesDate] [date],
[NetAmount] [float],
[VatPerc] [float],
[TotalAmount] [float]

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[reference]
)WITH ( BUCKET_COUNT = 30000000)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

 

--Memory-Optimized with Native SP Table Definition
CREATE TABLE [dbo].[n_output]
(
[CusID] [int] NOT NULL,
[CusName] varchar(50) COLLATE Latin1_General_100_BIN2 NOT NULL,
[TotalCusSales] [float] NOT NULL

INDEX [IX_ID_Key] NONCLUSTERED HASH
(
[CusID]
)WITH ( BUCKET_COUNT = 60)
)WITH ( MEMORY_OPTIMIZED = ON , DURABILITY = SCHEMA_ONLY);

Listing 20.4: Memory-Optimized Table Definitions for Natively-Compiled SP.

 

Step 7: Complete (5) Data Population Script Definition
In this step you will need to write the script that prepares the workload for the scenario. The data can either be sample data (as in the case of the standard scenarios) or data from other sources (e.g. Production data from a DBMS instance), fetched into the Simulator’s database from within In-Memory OLTP Simulator with the use of Linked Servers, or from outside In-Memory OLTP Simulator using SQL Server® technologies like SSIS and Data Export/Import tools.

image 
Figure 20.6: New Custom Scenario: Data Population Script.


In order to generate sample data for your scenario, enter the number of records that you want to be processed and click on the “Generate Sample Data” button. Scenario Manager will then generate a template script that you can further edit so that it is fully aligned with the data workload you want processed by the three execution (simulation) modes.

If you want to bring in data from another database, Scenario Manager can help you by generating a template as well. To do this, check the “Generate template for fetching data from other databases” checkbox. This results in a basic template script for bringing data from another database*. By checking the “Suggest optimization for loading process” checkbox, Scenario Manager modifies the previously generated template script and suggests the optimal loading sequence based on your custom scenario’s table definitions*.

* The suggested data population scripts for bringing data from other databases will still need your input, as you will need to create a linked server (or use an OPENQUERY command) in order to connect to the source SQL Server® instance/database and retrieve the data. To this end, setting up the connection to the source database and the final modification of the data population script for bringing the data into the corresponding In-Memory OLTP Simulator tables is the responsibility of the user. You need to be very careful during this process, especially when the source database is a Production database.
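As an illustration only (the server and instance names below are assumptions), a linked server such as the “PRODUCTION” one used in Listing 20.5 could be created with the standard system procedures:

```sql
-- Hypothetical names: 'PRODUCTION' is the linked server name used in
-- four-part queries; 'PRODSRV01' is the assumed source instance name.
EXEC sp_addlinkedserver
     @server = N'PRODUCTION',
     @srvproduct = N'',
     @provider = N'SQLNCLI',      -- SQL Server Native Client OLE DB provider
     @datasrc = N'PRODSRV01';

-- Map the current login to the remote server using its own credentials
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'PRODUCTION',
     @useself = N'TRUE';
```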

The final T-SQL script for this example is also provided in the below listing:

--Clean all tables
TRUNCATE TABLE [dbo].[d_area]
TRUNCATE TABLE [dbo].[d_customer]
TRUNCATE TABLE [dbo].[d_salesOrderHeader]

DELETE [dbo].[m_area]
DELETE [dbo].[m_customer]
DELETE [dbo].[m_salesOrderHeader]

DELETE [dbo].[n_area]
DELETE [dbo].[n_customer]
DELETE [dbo].[n_salesOrderHeader]

------------------------------------------
--Modify this part for retrieving the data
------------------------------------------

--Populate Memory-Optimized Tables
INSERT [dbo].[m_area]
SELECT *
FROM PRODUCTION.SalesDB.dbo.Area

INSERT [dbo].[m_customer]
SELECT *
FROM PRODUCTION.SalesDB.dbo.Customer

INSERT [dbo].[m_salesOrderHeader]
SELECT *
FROM PRODUCTION.SalesDB.dbo.SalesOrderHeader

--Populate Memory-Optimized Tables for Native SP
INSERT INTO [dbo].[n_area]
SELECT *
FROM [dbo].[m_area]

INSERT INTO [dbo].[n_customer]
SELECT *
FROM [dbo].[m_customer]

INSERT INTO [dbo].[n_salesOrderHeader]
SELECT *
FROM [dbo].[m_salesOrderHeader]

--Populate Disk-Based Tables
INSERT INTO [dbo].[d_area]
SELECT *
FROM [dbo].[m_area]

INSERT INTO [dbo].[d_customer]
SELECT *
FROM [dbo].[m_customer]

INSERT INTO [dbo].[d_salesOrderHeader]
SELECT *
FROM [dbo].[m_salesOrderHeader]
------------------------------------------

Listing 20.5: Data Population Script.

 

Note: In the above example, “PRODUCTION” is the name of a Linked Server connected to another SQL Server® instance, from which data is brought in for running the simulations.

Step 8: Complete (6) Prep Script Definition
In this step you will need to write the Prep Script. This script can include preparatory SQL statements that run prior to the execution of each new simulation run. The Prep Script’s execution time is not taken into consideration in the benchmarking process as it is out of the scope of In-Memory OLTP Simulator’s philosophy.

image 
Figure 20.7: New Custom Scenario: Prep Script.


In this example you can see that the Prep Script undertakes the truncation of the three output tables:

  • d_output
  • m_output
  • n_output

This ensures that right before each run of the simulation, the output tables will be emptied and thus become available to host the results of the simulation/data processing.

** When creating a custom scenario that uses output tables for hosting results, it is highly recommended to always truncate (or, in the case of memory-optimized tables, delete from) the output tables in the Prep Script, which does not participate in the time estimation process. Otherwise you would need to do this in the stored procedures, thus including these commands in the time estimation process. If you do not empty the output tables at all, a second run of your custom scenario might raise an exception (depending on the definition of your stored procedures). One might ask why the truncation of the output tables is not done automatically. The answer is simple: in custom scenarios, In-Memory OLTP Simulator does not know which table is the output one; this is also why you need to select the output table when you want to view the processing results.

The final T-SQL script is also provided in the below listing:

--For every run clean the output tables
TRUNCATE TABLE dbo.d_output
DELETE dbo.m_output
DELETE dbo.n_output

Listing 20.6: Prep Script.


Note: Memory-optimized tables cannot be truncated. They can only be emptied with the DELETE statement.

Step 9: Complete (7) Disk-Based SP Definition
In this step you will need to define the Disk-Based stored procedure. This is the entity that contains the scenario’s logic for the Disk-Based execution mode and uses the Disk-Based Tables.

image 
Figure 20.8: New Custom Scenario: Disk-Based SP Definition.

 

--Disk-Based Stored Procedure for Scenario Execution - Definition
CREATE PROCEDURE [dbo].[d_process]
AS
BEGIN
    --Insert query result into the output table
    INSERT INTO dbo.d_output
    SELECT TOP 20
        c.id AS CusID,
        c.name AS CusName,
        SUM(s.TotalAmount) AS TotalCusSales
    FROM dbo.d_salesOrderHeader s
    INNER JOIN dbo.d_Customer c ON s.cusID = c.id
    WHERE DATEDIFF(yyyy, s.salesDate, GETDATE()) <= 10
    GROUP BY c.id, c.name
    ORDER BY 2 DESC;
END;

Listing 20.7: Disk-Based SP Definition.


As you can see, the stored procedure's results are stored in the output table d_output.
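If you want to sanity-check the procedure outside the Simulator (e.g. from SSMS, once the scenario’s tables have been created and populated), a minimal run looks like this:

```sql
-- Execute the disk-based procedure and inspect its output
EXEC dbo.d_process;
SELECT * FROM dbo.d_output;   -- the 20 aggregated customer rows
```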

 

Step 10: Complete (8) Memory-Optimized SP Definition
In this step you need to define the Memory-Optimized stored procedure. This is the entity that contains the scenario’s logic for the Memory-Optimized execution mode and uses the Mem-Opt Tables.

image 
Figure 20.9: New Custom Scenario: Memory-Optimized SP Definition.

 

--Memory-Optimized Stored Procedure for Scenario Execution - Definition
CREATE PROCEDURE [dbo].[m_process]
AS
BEGIN
    --Insert query result into the output table
    INSERT INTO dbo.m_output
    SELECT TOP 20
        c.id AS CusID,
        c.name AS CusName,
        SUM(s.TotalAmount) AS TotalCusSales
    FROM dbo.m_salesOrderHeader s
    INNER JOIN dbo.m_Customer c ON s.cusID = c.id
    WHERE DATEDIFF(yyyy, s.salesDate, GETDATE()) <= 10
    GROUP BY c.id, c.name
    ORDER BY 2 DESC;
END;

Listing 20.8: Memory-Optimized SP Definition.

 

As you can see, the stored procedure's results are stored in the output table m_output.

 

Step 11: Complete (9) Natively-Compiled SP Definition
In this step you need to define the Natively-Compiled stored procedure. This is the entity that contains the scenario’s logic for the Memory-Optimized with Natively-Compiled Stored Procedure execution mode and uses the Mem-Opt Tables for Native SP.

image 
Figure 20.10: New Custom Scenario: Natively-Compiled SP Definition.

 

--Natively Compiled Stored Procedure for Scenario Execution - Definition
CREATE PROCEDURE [dbo].[n_process]
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH
(TRANSACTION ISOLATION LEVEL = SNAPSHOT,
 LANGUAGE = N'us_english')

    --Insert query result into the output table
    INSERT INTO dbo.n_output
    SELECT TOP 20
        c.id AS CusID,
        c.name COLLATE Latin1_General_100_BIN2 AS CusName,
        SUM(s.TotalAmount) AS TotalCusSales
    FROM dbo.n_salesOrderHeader s
    INNER JOIN dbo.n_Customer c ON s.cusID = c.id
    WHERE DATEDIFF(yyyy, s.salesDate, GETDATE()) <= 10
    GROUP BY c.id, c.name COLLATE Latin1_General_100_BIN2
    ORDER BY 2 DESC;

END;

Listing 20.9: Natively-Compiled SP Definition.

 

As you can see, the stored procedure's results are stored in the output table n_output.

In the natively-compiled stored procedure you will also notice that the varchar columns are explicitly converted to the Latin1_General_100_BIN2 collation. That is because in the SQL Server® 2014 release of In-Memory OLTP you cannot use ORDER BY or GROUP BY on string columns that do not use a BIN2 collation, and indexes can only be created on string columns if they use a BIN2 collation (more info…).
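An alternative to applying COLLATE in every query is to declare the string column with a BIN2 collation directly in the memory-optimized table definition. The sketch below is illustrative only; the table name, column sizes and bucket count are assumptions, not the Simulator’s actual definitions:

```sql
-- Hypothetical memory-optimized customer table with a BIN2 string column
CREATE TABLE dbo.n_Customer_bin2
(
    id   INT NOT NULL
         PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    name VARCHAR(100) COLLATE Latin1_General_100_BIN2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

With the column declared this way, a natively-compiled procedure could GROUP BY and ORDER BY name without per-query COLLATE clauses.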

Step 12: Review (10) Execution Script
In this step you need to review the execution scripts for the three stored procedures. The execution scripts are built dynamically; however, if your stored procedures use input parameters, you need to edit the scripts manually to supply them.

image 
Figure 20.11: The Execution Scripts for the Three Stored Procedures.
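For example, if one of your stored procedures took an input parameter (the parameter below is hypothetical; the procedures in this example scenario take none), you would edit the generated execution script along these lines:

```sql
-- Generated execution script (parameterless procedure):
EXEC dbo.d_process;

-- Manually edited version for a hypothetical parameterized procedure:
EXEC dbo.d_process @yearsBack = 10;
```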

 

Step 13: Complete (11) Housekeeping
This is the last step in defining a custom scenario, and it is optional. The Housekeeping script is simply a series of commands you may want executed each time a simulation run of the specific custom scenario completes.

image 
Figure 20.12: The Housekeeping Script.


Step 14: Saving the Custom Scenario
Now all that is left is to save the custom scenario. You can either click directly on the “Apply” button or click first on the “Check Syntax” button. In either case, a series of syntax checks will run to validate the custom scenario’s definition* before writing it to disk.

*Note: Keep in mind that even though In-Memory OLTP Simulator runs a series of syntax checks, it does not cover all possible syntax errors, especially in the case of the memory-optimized data structure definitions. If a syntax error is not caught by the syntax checker, an error will be raised during the scenario’s execution.

image 
Figure 20.13: Syntax Check.

 

image 
Figure 20.14: Scenario Saved.


Now, your newly created custom scenario is saved and ready for execution.

Back to Top


21. Special Topics: Generating and Using Multiple Simulator Databases

In the Ultimate Edition of In-Memory OLTP Simulator you can generate and use multiple Simulator databases.

As mentioned in the “Getting Started” topic, when you first use In-Memory OLTP Simulator you have to connect to a 64-bit SQL Server® 2014 (or later) instance (Enterprise, Developer or Evaluation edition) and generate the Simulator’s database. Then in subsequent uses of the tool you just have to connect to the database.

If you try to run the database generation multiple times in the Community Edition of In-Memory OLTP Simulator, you will get an error message stating that a Simulator database already exists and that you cannot generate another one. However, in the Ultimate Edition if you run the database generation process more than once, then additional Simulator databases will be generated each time.

If you have generated more than one In-Memory OLTP Simulator database, you can select which one to use via the SQL Instance Connection dialog:

image 
Figure 21.1: Select a Simulator Database to Connect to (Ultimate Edition).

Back to Top

 

22. Special Topics: How Simulation Times are Calculated

If you time a simulation run in In-Memory OLTP Simulator with a stopwatch, you will notice that your measurements do not exactly match the times the Simulator reports. This is expected: the way In-Memory OLTP Simulator calculates the simulation time for each execution mode is very specific, in order to be as accurate as possible. This topic explains the computation.

First of all, as you can see in the Simulation Page (see the screenshot below), the scenario initialization step is a separate task. The initialization process creates the required data structures as instructed by the selected scenario’s definition, prepares the data to be used, and performs some other housekeeping tasks to prepare the scenario for execution. As these tasks are common to all execution modes, their execution time is excluded: In-Memory OLTP Simulator aims to compare only the pure execution times of each mode.

image 
Figure 22.1: Simulation Page.


Another important aspect to take into consideration is caching/buffering. Every time a scenario mode is executed, In-Memory OLTP Simulator cleans the procedure and data caches, so that neither SQL statement plans are reused from the procedure cache nor data is fetched from the data cache. Even though this task runs in the background during each scenario mode’s execution, it is not counted in the mode’s execution time, as it only prepares the mode for the most accurate possible measurement. The cleaning of buffers is a built-in process and you do not need to take any additional actions. One might argue that it would be more realistic not to clean the cache. However, since the workload patterns that benefit most from In-Memory Optimization are fast-changing data environments with large volumes of data, cleaning the cache before each mode of a scenario is executed was considered more realistic, as it measures the pure computation time for each mode.
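The help does not state exactly how the Simulator cleans the caches, but the standard way to do it in SQL Server, which you could use when reproducing a run manually in SSMS, looks like this (an assumption, not the Simulator’s confirmed implementation):

```sql
-- Flush dirty pages first so clean buffers can be dropped safely
CHECKPOINT;
-- Clear the data (buffer) cache
DBCC DROPCLEANBUFFERS;
-- Clear the plan (procedure) cache
DBCC FREEPROCCACHE;
```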

The following diagram illustrates all the above:

image 
Figure 22.2: How Execution Time is Calculated for Each Mode.


Note: If you execute the scenario in SQL Server® Management Studio (SSMS), you might notice slightly better performance in all of its execution modes. This can happen because In-Memory OLTP Simulator is a .NET application that uses data-access drivers to connect to the SQL Server® Database Engine, which may introduce some overhead, even though all possible steps have been taken to minimize it.

Back to Top

 

23. Special Topics: Activating the Ultimate Edition

Activating the Ultimate Edition of In-Memory OLTP Simulator is a simple process. After you purchase a license for the Ultimate Edition, the next step is to proceed with the activation. From within In-Memory OLTP Simulator, under the “Help” menu, you click on the “Activate Ultimate Edition” item:

image 
Figure 23.1: The “Activate Ultimate Edition” Option.


You will then be presented with the Ultimate Edition activation dialog:

image 
Figure 23.2: The Ultimate Edition Activation Dialog.


In Step 1 of the activation process you need to enter the serial key that you were provided during your purchase of the Ultimate Edition license of In-Memory OLTP Simulator.

Next, by clicking on the “Get Activation Code” button you proceed to Step 2, where the activation dialog generates a unique hardware hash. A hash is a numeric value derived from some original value through a mathematical formula; here, it is created by running two pieces of information from the PC's hardware components (BIOS id, mainboard id) through a one-way mathematical transformation. This means that the resulting hash value cannot be reversed to determine the original values, ensuring complete anonymity. This process was developed to maintain the user's privacy, and it is only used to deter piracy.
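The idea of a one-way hash can be illustrated in T-SQL with HASHBYTES (this is purely illustrative; the Simulator’s actual hashing algorithm and inputs are not documented, and the identifier values below are made up):

```sql
-- Purely illustrative one-way hash over two hardware identifiers
DECLARE @biosId      NVARCHAR(100) = N'BIOS-1234';  -- hypothetical value
DECLARE @mainboardId NVARCHAR(100) = N'MB-5678';    -- hypothetical value

-- The SHA-256 output cannot be reversed to recover the inputs
SELECT HASHBYTES('SHA2_256', @biosId + N'|' + @mainboardId) AS HardwareHash;
```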

image 
Figure 23.3: After Activation Steps 1 and 2 are Completed.

 

After Steps 1 and 2 are completed, you can proceed to Step 3, which is requesting the License File. In this step, you click on the “Request License File” button and are presented with a message that needs to be sent to the In-Memory OLTP Simulator Activation Service:

image 
Figure 23.4: The Activation Request.


You can send the generated license request either by copying the serial key and activation code and emailing them to the provided email address, or by clicking on the “Launch Email Client” button and sending the dynamically created email through your default email client (please use the same email address used for purchasing the license – this is done for security purposes).

After completing the license file request, and provided the submitted information is successfully validated, you will receive the license file by email. You can then load it into In-Memory OLTP Simulator by clicking on the “Load License File” button. After this action, the Ultimate Edition of In-Memory OLTP Simulator will be activated.

image 
Figure 23.5: Successful Activation of the Ultimate Edition.

 

* Important Note: A single license purchase of In-Memory OLTP Simulator’s Ultimate Edition allows up to two (2) activations of the product on different hardware configurations. The second activation should be used in case your computer hardware changes (e.g. a new mainboard) or if you replace your computer with a new one. To perform more than two activations on different hardware configurations, you will need to purchase additional licenses.

Back to Top

