



How can I export Query Store data?


I have Query Store (QS) running on a SQL Server 2017 instance, currently at RTM. CU13 is currently in test, to be applied to production in next month's patch window.



While most queries and reports return results quickly with little to no impact, anything I try to look at around waits is problematic: CPU usage climbs from 20 to 80 percent and stays there for minutes until I kill the query. This is a 24/7 production system, so if I really want to look at QS waits I am going to need to do it somewhere else.



The database is 150 GB, with 1000 MB of space allocated for QS. I have a sandbox with 10 GB of space, so if I could get the QS data out, I could play with it there.



I looked around and could not find how to do this. The best I found is this sql.sasquatch 2016 post, with a 2016 answer by Erin Stellato:




At present, there is no option to export and/or import Query Store data, but there is a Connect item to vote for: https://connect.microsoft.com/SQLServer/feedback/details/2620017/export-query-store-tables-separately-from-the-database-tables




Note: The link goes to a redirect, "Microsoft Connect Has Been Retired". It looks like the actual link should be https://feedback.azure.com/forums/908035-sql-server/suggestions/32901670-export-query-store-tables-separately-from-the-data



Looking at Microsoft's documentation, I find that almost everything you might use to access the data is a view, stored procedure, or report. I am not seeing a way to just extract all the QS data out of the database.



Examples of direct queries use the views (e.g., this example by Kendra Little). I have toyed with the idea of just doing a SELECT * from the views and exporting the results to my sandbox, but as I have not found anyone discussing that approach, I am not sure it is a good idea.



Related



  • Best Practice with the Query Store

  • How Query Store Collects Data

  • Query Store Catalog Views

  • Query Store Stored Procedures

  • Monitoring performance by using the Query Store

Additionally, I would like to be able to keep the Query Store results from pre-CU13 as a baseline to compare against post-CU13 performance.



Edit after first answer and edits to same
The recent edits to the answer by jadarnel27 add good info, but I don't care about the user interface; I want to be able to query the data without altering the database or impacting performance. As a secondary goal, I would like to be able to archive the QS data so I can review previous performance (e.g. from before an upgrade, after the old QS data would have been purged).






































      sql-server sql-server-2017 query-store






      asked 13 hours ago by James Jenkins; edited 8 hours ago
          1 Answer
































          First of all, you might be able to get acceptable performance with queries directly against the query store catalog views by updating stats, adding query hints with plan guides, or changing the database compatibility level / CE. See the answers from Forrest and Marian here:



          Never ending Query Store search



          Otherwise, for exporting, one approach would be a simple SELECT...INTO from the query store views to the "sandbox" database. These are the relevant views.



          The basic approach would be like this:



          SELECT * INTO Sandbox.dbo.query_store_runtime_stats FROM sys.query_store_runtime_stats;
          SELECT * INTO Sandbox.dbo.query_store_runtime_stats_interval FROM sys.query_store_runtime_stats_interval;
          SELECT * INTO Sandbox.dbo.query_store_plan FROM sys.query_store_plan;
          SELECT * INTO Sandbox.dbo.query_store_query FROM sys.query_store_query;
          SELECT * INTO Sandbox.dbo.query_store_query_text FROM sys.query_store_query_text;
          SELECT * INTO Sandbox.dbo.query_store_wait_stats FROM sys.query_store_wait_stats;


          The nice thing about this approach is that:



          • you'll only get the data you need (1000 MB)

          • you can add indexes to support your reporting queries, because these are actual tables

          • they won't have the unusual memory scanning behavior that leads to poor performance against the actual views (again because they are actual tables)

            • Note: the SELECT...INTO queries shouldn't drive up CPU like the built-in query store reporting queries, because they won't have the problematic joins that cause repeated access to the in-memory TVFs


          • you can keep different versions of the data (for different CU levels, etc.) by changing the table names, or by adding a column to the tables that indicates the date and / or version of SQL Server that was used for that import
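          Because the copies are plain user tables, you can index them to suit your reporting queries. As a hypothetical example (the column names come from the sys.query_store_runtime_stats view; the index name and key choice are assumptions):

          -- Hypothetical example: cluster the copied runtime stats by plan and
          -- interval, which suits typical per-plan, per-time-window reporting
          USE Sandbox;
          GO
          CREATE CLUSTERED INDEX CX_query_store_runtime_stats
              ON dbo.query_store_runtime_stats (plan_id, runtime_stats_interval_id);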

          The "con" of this approach is that you can't use the query store user interface. A workaround for that would be to use profiler or extended events to capture the queries being executed by the user interface for the specific reports you need. You could even do this capture in a non-prod environment, as the queries should be the same.
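          A minimal Extended Events session along these lines could do that capture (the session name and file target are assumptions, and the SSMS application name filter may need adjusting for your SSMS version):

          -- Sketch: capture the T-SQL batches the SSMS Query Store reports execute
          CREATE EVENT SESSION [CaptureQueryStoreUI] ON SERVER
          ADD EVENT sqlserver.sql_batch_completed
          (
              ACTION (sqlserver.client_app_name, sqlserver.sql_text)
              WHERE sqlserver.client_app_name LIKE N'Microsoft SQL Server Management Studio%'
          )
          ADD TARGET package0.event_file (SET filename = N'CaptureQueryStoreUI.xel');
          GO
          ALTER EVENT SESSION [CaptureQueryStoreUI] ON SERVER STATE = START;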




          Warning: this is potentially a really bad idea. There's a reason you can't normally write to these tables. Special thanks to Forrest for mentioning the possibility to me.



          If you really want to be able to use the user interface, you can actually load the base Query Store tables with data while connecting via the DAC. Here's what worked for me.



          Reminder: you have to be using a DAC connection to do this; otherwise you'll get errors about the sys.plan_persist_* tables not existing.



          USE [master];
          GO
          CREATE DATABASE [Sandbox];
          GO

          USE [YourSourceDatabaseWithTheQueryStoreInfo];
          GO

          BEGIN TRANSACTION;

          INSERT INTO Sandbox.sys.plan_persist_runtime_stats SELECT * FROM sys.plan_persist_runtime_stats;
          INSERT INTO Sandbox.sys.plan_persist_runtime_stats_interval SELECT * FROM sys.plan_persist_runtime_stats_interval;
          INSERT INTO Sandbox.sys.plan_persist_plan SELECT * FROM sys.plan_persist_plan;
          INSERT INTO Sandbox.sys.plan_persist_query SELECT * FROM sys.plan_persist_query;
          INSERT INTO Sandbox.sys.plan_persist_query_text SELECT * FROM sys.plan_persist_query_text;
          INSERT INTO Sandbox.sys.plan_persist_wait_stats SELECT * FROM sys.plan_persist_wait_stats;

          COMMIT TRANSACTION;
          GO

          USE [master];
          GO
          ALTER DATABASE [Sandbox] SET QUERY_STORE = ON (OPERATION_MODE = READ_ONLY);


          Note: if you're on SQL Server 2016, you'll need to remove the line that loads wait stats; that catalog view wasn't added until SQL Server 2017.



          After doing that, I was able to use the Query Store UI in SSMS to view info on the queries from the source database. Neat!



          It's important to load the data into the Sandbox database with Query Store off, and then turn Query Store on in read-only mode. Otherwise QS ends up in an error state, and this is written to the SQL Server error log:




          Error: 12434, Severity: 20, State: 56.

          The Query Store in database Sandbox is invalid, possibly due to schema or catalog inconsistency.




          [screenshot: Query Store in the "Error" state]



          I also noticed that this doesn't work if there are in-memory OLTP (Hekaton) tables in the source database. No matter what I do, Query Store ends up in the "Error" state with this message in the error log:




          Error: 5571, Severity: 16, State: 2.

          Internal FILESTREAM error: failed to access the garbage collection table.




          You may be able to work around that by adding a memory-optimized filegroup to the Sandbox database; I haven't tried that yet.
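          If you want to try it, the sketch below shows what that might look like (untested, per the above; the filegroup name and container path are placeholders):

          -- Untested sketch: give Sandbox a memory-optimized filegroup before loading
          ALTER DATABASE [Sandbox]
              ADD FILEGROUP [Sandbox_mod] CONTAINS MEMORY_OPTIMIZED_DATA;
          ALTER DATABASE [Sandbox]
              ADD FILE (NAME = N'Sandbox_mod', FILENAME = N'C:\Data\Sandbox_mod')
              TO FILEGROUP [Sandbox_mod];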






          share|improve this answer
























            Your Answer








            StackExchange.ready(function()
            var channelOptions =
            tags: "".split(" "),
            id: "182"
            ;
            initTagRenderer("".split(" "), "".split(" "), channelOptions);

            StackExchange.using("externalEditor", function()
            // Have to fire editor after snippets, if snippets enabled
            if (StackExchange.settings.snippets.snippetsEnabled)
            StackExchange.using("snippets", function()
            createEditor();
            );

            else
            createEditor();

            );

            function createEditor()
            StackExchange.prepareEditor(
            heartbeatType: 'answer',
            autoActivateHeartbeat: false,
            convertImagesToLinks: false,
            noModals: true,
            showLowRepImageUploadWarning: true,
            reputationToPostImages: null,
            bindNavPrevention: true,
            postfix: "",
            imageUploader:
            brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
            contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
            allowUrls: true
            ,
            onDemand: true,
            discardSelector: ".discard-answer"
            ,immediatelyShowMarkdownHelp:true
            );



            );













            draft saved

            draft discarded


















            StackExchange.ready(
            function ()
            StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fdba.stackexchange.com%2fquestions%2f231650%2fhow-can-i-export-query-store-data%23new-answer', 'question_page');

            );

            Post as a guest















            Required, but never shown

























            1 Answer
            1






            active

            oldest

            votes








            1 Answer
            1






            active

            oldest

            votes









            active

            oldest

            votes






            active

            oldest

            votes









            4














            First of all, you might be able to get acceptable performance with queries directly against the query store catalog views by updating stats, adding query hints with plan guides, or changing the database compatibility level / CE. See the answers from Forrest and Marian here:



            Never ending Query Store search



            Otherwise, for exporting, one approach would be a simple SELECT...INTO from the query store views to the "sandbox" database. These are the relevant views.



            The basic approach would be like this:



            SELECT * INTO Sandbox.dbo.query_store_runtime_stats FROM sys.query_store_runtime_stats;
            SELECT * INTO Sandbox.dbo.query_store_runtime_stats_interval FROM sys.query_store_runtime_stats_interval;
            SELECT * INTO Sandbox.dbo.query_store_plan FROM sys.query_store_plan;
            SELECT * INTO Sandbox.dbo.query_store_query FROM sys.query_store_query;
            SELECT * INTO Sandbox.dbo.query_store_query_text FROM sys.query_store_query_text;
            SELECT * INTO Sandbox.dbo.query_store_wait_stats FROM sys.query_store_wait_stats;


            The nice thing about this approach is that:



            • you'll only get the data you need (1000 MB)

            • you can add indexes to support your reporting queries, because these are actual tables

            • they won't have the unusual memory scanning behavior that leads to poor performance against the actual views (again because they are actual tables)

              • Note: the SELECT...INTO queries shouldn't drive up CPU like the built-in query store reporting queries, because they won't have the problematic joins that cause repeated access to the in-memory TVFs


            • you can keep different versions of the data (for different CU levels, etc) by changing the table names, or adding a column to the tables that indicates the data and / or version of SQL Server that was used for that import

            The "con" of this approach is that you can't use the query store user interface. A workaround for that would be to use profiler or extended events to capture the queries being executed by the user interface for the specific reports you need. You could even do this capture in a non-prod environment, as the queries should be the same.




            Warning: this is potentially a really bad idea. There's a reason you can't normally write to these tables. Special thanks to Forrest for mentioning the possibility to me.



            If you really want to be able to use the user interface, you can actually load the base Query Store tables with data while connecting via the DAC. Here's what worked for me.



            Reminder: you have to be using a DAC connection to do this, otherwise you'll get errors related to the sys.plan_persist_* tables not existing



            USE [master];
            GO
            CREATE DATABASE [Sandbox];
            GO

            USE [YourSourceDatabaseWithTheQueryStoreInfo];
            GO

            BEGIN TRANSACTION;

            INSERT INTO Sandbox.sys.plan_persist_runtime_stats SELECT * FROM sys.plan_persist_runtime_stats;
            INSERT INTO Sandbox.sys.plan_persist_runtime_stats_interval SELECT * FROM sys.plan_persist_runtime_stats_interval;
            INSERT INTO Sandbox.sys.plan_persist_plan SELECT * FROM sys.plan_persist_plan;
            INSERT INTO Sandbox.sys.plan_persist_query SELECT * FROM sys.plan_persist_query;
            INSERT INTO Sandbox.sys.plan_persist_query_text SELECT * FROM sys.plan_persist_query_text;
            INSERT INTO Sandbox.sys.plan_persist_wait_stats SELECT * FROM sys.plan_persist_wait_stats;

            COMMIT TRANSACTION;
            GO

            USE [master];
            GO
            ALTER DATABASE [Sandbox] SET QUERY_STORE = ON (OPERATION_MODE = READ_ONLY);


            Note: if you're on SQL Server 2016, you'll need to remove the line about wait stats - that catalog view wasn't added until SQL Server 2017



            After doing that, I was able to use the Query Store UI in SSMS to view info on the queries from the source database. Neat!



            It's important to load the data into the Sandbox database with Query Store off, and then turn Query Store on in read only mode. Otherwise QS ended up in an error state, and this was written to the SQL Server error log:




            Error: 12434, Severity: 20, State: 56.

            The Query Store in database Sandbox is invalid, possibly due to schema or catalog inconsistency.




            query store in the "error" state



            I also noticed that this doesn't work if there are in-memory OLTP (Hekaton) tables in the source database. No matter what I do, Query Store ends up in the "Error" state with this message in the error log:




            Error: 5571, Severity: 16, State: 2.

            Internal FILESTREAM error: failed to access the garbage collection table.




            You may be able to work around that by adding a memory-optimized filegroup to the Sandbox database, I haven't tried that yet.






            share|improve this answer





























              4














              First of all, you might be able to get acceptable performance with queries directly against the query store catalog views by updating stats, adding query hints with plan guides, or changing the database compatibility level / CE. See the answers from Forrest and Marian here:



              Never ending Query Store search



              Otherwise, for exporting, one approach would be a simple SELECT...INTO from the query store views to the "sandbox" database. These are the relevant views.



              The basic approach would be like this:



              SELECT * INTO Sandbox.dbo.query_store_runtime_stats FROM sys.query_store_runtime_stats;
              SELECT * INTO Sandbox.dbo.query_store_runtime_stats_interval FROM sys.query_store_runtime_stats_interval;
              SELECT * INTO Sandbox.dbo.query_store_plan FROM sys.query_store_plan;
              SELECT * INTO Sandbox.dbo.query_store_query FROM sys.query_store_query;
              SELECT * INTO Sandbox.dbo.query_store_query_text FROM sys.query_store_query_text;
              SELECT * INTO Sandbox.dbo.query_store_wait_stats FROM sys.query_store_wait_stats;


              The nice thing about this approach is that:



              • you'll only get the data you need (1000 MB)

              • you can add indexes to support your reporting queries, because these are actual tables

              • they won't have the unusual memory scanning behavior that leads to poor performance against the actual views (again because they are actual tables)

                • Note: the SELECT...INTO queries shouldn't drive up CPU like the built-in query store reporting queries, because they won't have the problematic joins that cause repeated access to the in-memory TVFs


              • you can keep different versions of the data (for different CU levels, etc) by changing the table names, or adding a column to the tables that indicates the data and / or version of SQL Server that was used for that import

              The "con" of this approach is that you can't use the query store user interface. A workaround for that would be to use profiler or extended events to capture the queries being executed by the user interface for the specific reports you need. You could even do this capture in a non-prod environment, as the queries should be the same.




              Warning: this is potentially a really bad idea. There's a reason you can't normally write to these tables. Special thanks to Forrest for mentioning the possibility to me.



              If you really want to be able to use the user interface, you can actually load the base Query Store tables with data while connecting via the DAC. Here's what worked for me.



              Reminder: you have to be using a DAC connection to do this, otherwise you'll get errors related to the sys.plan_persist_* tables not existing



              USE [master];
              GO
              CREATE DATABASE [Sandbox];
              GO

              USE [YourSourceDatabaseWithTheQueryStoreInfo];
              GO

              BEGIN TRANSACTION;

              INSERT INTO Sandbox.sys.plan_persist_runtime_stats SELECT * FROM sys.plan_persist_runtime_stats;
              INSERT INTO Sandbox.sys.plan_persist_runtime_stats_interval SELECT * FROM sys.plan_persist_runtime_stats_interval;
              INSERT INTO Sandbox.sys.plan_persist_plan SELECT * FROM sys.plan_persist_plan;
              INSERT INTO Sandbox.sys.plan_persist_query SELECT * FROM sys.plan_persist_query;
              INSERT INTO Sandbox.sys.plan_persist_query_text SELECT * FROM sys.plan_persist_query_text;
              INSERT INTO Sandbox.sys.plan_persist_wait_stats SELECT * FROM sys.plan_persist_wait_stats;

              COMMIT TRANSACTION;
              GO

              USE [master];
              GO
              ALTER DATABASE [Sandbox] SET QUERY_STORE = ON (OPERATION_MODE = READ_ONLY);


              Note: if you're on SQL Server 2016, you'll need to remove the line about wait stats - that catalog view wasn't added until SQL Server 2017



              After doing that, I was able to use the Query Store UI in SSMS to view info on the queries from the source database. Neat!



              It's important to load the data into the Sandbox database with Query Store off, and then turn Query Store on in read only mode. Otherwise QS ended up in an error state, and this was written to the SQL Server error log:




              Error: 12434, Severity: 20, State: 56.

              The Query Store in database Sandbox is invalid, possibly due to schema or catalog inconsistency.




              query store in the "error" state



              I also noticed that this doesn't work if there are in-memory OLTP (Hekaton) tables in the source database. No matter what I do, Query Store ends up in the "Error" state with this message in the error log:




              Error: 5571, Severity: 16, State: 2.

              Internal FILESTREAM error: failed to access the garbage collection table.




              You may be able to work around that by adding a memory-optimized filegroup to the Sandbox database, I haven't tried that yet.






              share|improve this answer



























                4












                4








                4







                answered 12 hours ago by jadarnel27 (edited 10 hours ago)



























