
  • Dec 03 / 2013
  • 0
Constraints and Keys, Data Modeling and Database Design, dbDigger, DDL, SQL Server Error messages

The ALTER TABLE statement conflicted with the FOREIGN KEY constraint

Today I was working on creating foreign keys on a few tables. Some of the ADD CONSTRAINT statements failed with the following error message.

The ALTER TABLE statement conflicted with the FOREIGN KEY constraint “%”. The conflict occurred in database “%”, table “%”, column ‘%’.

If we look closely at the message, it points to the very foreign key constraint we are trying to create. The reason is that, while creating the constraint, SQL Server validates the existing data against the new constraint. Some records in the child table have no matching reference in their parent (primary key) table, and those orphaned records are the cause of the error here.
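
As a minimal sketch (the table and column names below are hypothetical), such orphaned child records can be detected with a LEFT JOIN against the parent table:

-- Hypothetical schema: dbo.Orders.CustomerID should reference dbo.Customers.CustomerID
-- Rows returned here have no matching parent row and would block the new constraint
SELECT O.*
FROM dbo.Orders AS O
LEFT JOIN dbo.Customers AS C ON O.CustomerID = C.CustomerID
WHERE C.CustomerID IS NULL
GO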

SOLUTION

We have a couple of options. If you want to make sure that no such orphaned records remain in the subject table, you may simply detect and delete them; the ADD CONSTRAINT statement will then work without any error. However, deleting data will rarely be a suitable option. The alternative is to use the ADD CONSTRAINT statement with the NOCHECK option, which makes sure that existing data is not validated against the constraint at creation time.
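
A minimal sketch, again with hypothetical object names:

-- WITH NOCHECK skips validation of existing rows; new and modified rows are still checked
ALTER TABLE dbo.Orders WITH NOCHECK
ADD CONSTRAINT FK_Orders_Customers FOREIGN KEY (CustomerID)
REFERENCES dbo.Customers (CustomerID)
GO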

WITH CHECK | WITH NOCHECK

Here is some informative text on these options from BOL.

Specifies whether the data in the table is or is not validated against a newly added or re-enabled FOREIGN KEY or CHECK constraint. If not specified, WITH CHECK is assumed for new constraints, and WITH NOCHECK is assumed for re-enabled constraints.
If you do not want to verify new CHECK or FOREIGN KEY constraints against existing data, use WITH NOCHECK. Any constraint violations that are suppressed by WITH NOCHECK when the constraint is added may cause future updates to fail if they update rows with data that does not comply with the constraint.

QUERY OPTIMIZER BEHAVIOR

The query optimizer does not consider constraints that are defined WITH NOCHECK. Such constraints are ignored until they are re-enabled by using ALTER TABLE table WITH CHECK CHECK CONSTRAINT ALL.
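
Once the offending data has been cleaned up, the constraint can be re-validated so the optimizer will trust it again; a sketch using the same hypothetical table:

-- Re-check all constraints against existing data and mark them as trusted
ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL
GO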

  • Oct 11 / 2013
  • 0
Backup and Restore, Consultancy, dbDigger, T-SQL Tips and Tricks

Create directories through T-SQL

In a recent task I was connected to SQL Server through SSMS and had no OS access. To configure automatic backups I needed to create a couple of directories on disk, so without OS access I had to create them using T-SQL. Following is the command that I used to create 'D:\DBBackups\DBDigger'.

EXEC master.dbo.xp_create_subdir 'D:\DBBackups\DBDigger'
GO
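
To confirm that the directory now exists, the undocumented (but widely used) xp_fileexist procedure can be called; treat this as a sketch rather than an official API:

-- Returns flags indicating whether the path exists and whether it is a directory
EXEC master.dbo.xp_fileexist 'D:\DBBackups\DBDigger'
GO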
  • Oct 04 / 2013
  • 0
DBCC Commands, dbDigger, Monitoring and Analysis, Transaction log files

Get total size of log file along with used space percentage for each SQL Server database

Every SQL Server database has its own transaction log, usually a single file. The log file keeps growing while the database is in the full or bulk-logged recovery model, so DBAs implement strategies to manage the log file size. There are two aspects of log file size:

  • One is the size the log file occupies on disk, as shown at the OS level.
  • The second is the filled portion within that overall log file.

Suppose a log file is 20 GB on disk; it is quite possible that only 1 GB of it is filled with log records and the remaining 19 GB is empty. SQL Server will keep populating the remaining part until the file is full, and will then claim more space according to the file growth settings.
It is a basic requirement to have an idea of the log file size and its used space. Following is a very simple way to get this information. You may also insert the information into a table on a daily or weekly basis and raise alerts from it, as sketched after the command below.

DBCC SQLPERF (LOGSPACE)
GO
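
A minimal sketch of the daily logging idea mentioned above; the table name and schema here are just an assumption:

-- Hypothetical history table for periodic log space snapshots
CREATE TABLE dbo.LogSpaceHistory (
    CaptureDate     DATETIME NOT NULL DEFAULT GETDATE(),
    DatabaseName    SYSNAME NOT NULL,
    LogSizeMB       FLOAT NOT NULL,
    LogSpaceUsedPct FLOAT NOT NULL,
    Status          INT NOT NULL
)
GO

-- Capture the current values; schedule this in a SQL Agent job for daily collection
INSERT INTO dbo.LogSpaceHistory (DatabaseName, LogSizeMB, LogSpaceUsedPct, Status)
EXEC ('DBCC SQLPERF (LOGSPACE)')
GO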


As a result we get the database name, the total log file size, and the filled log space percentage. The last column, Status, is always zero.

  • Oct 03 / 2013
  • 0
dbDigger, Monitoring and Analysis, Reporting Services SSRS, T-SQL Scripts

Scripts for SSRS reports usage analysis

Here are some handy and useful scripts that may be used to analyze the report usage of your SSRS report server. All of these are SELECT scripts and use the NOLOCK hint, so feel free to add filters and customize them according to your specific requirements.
To get the last time a report was generated, we have the following script. Further filters may be applied if required.

 SELECT C.NAME AS [Report Name]  
   ,MAX(EL.TIMESTART) AS [Last Start Time]  
 FROM REPORTSERVER.DBO.EXECUTIONLOG AS EL WITH (NOLOCK)  
 INNER JOIN REPORTSERVER.DBO.CATALOG AS C WITH (NOLOCK) ON EL.REPORTID = C.ITEMID  
 GROUP BY C.NAME  
 ORDER BY C.NAME  

If you want to analyze how many reports were viewed on a specific day, the following script generates report usage stats by date. It is advisable to exclude the account that runs the SSRS service: reports run from cache or by subscriptions are also counted here, so if you want to see only reports run by actual users, filter out the SSRS service account (see the example after the script).

 SELECT CONVERT(VARCHAR(25), TIMESTART, 101) AS [Specific Date]  
   ,COUNT(*) AS [Generated Number]  
 FROM REPORTSERVER.DBO.EXECUTIONLOG WITH (NOLOCK)  
 --WHERE USERNAME NOT IN  
 GROUP BY CONVERT(VARCHAR(25), TIMESTART, 101)  
 ORDER BY CONVERT(VARCHAR(25), TIMESTART, 101) DESC  
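
For example, the commented filter could be enabled as follows; the service account name is only a placeholder:

 SELECT CONVERT(VARCHAR(25), TIMESTART, 101) AS [Specific Date]  
   ,COUNT(*) AS [Generated Number]  
 FROM REPORTSERVER.DBO.EXECUTIONLOG WITH (NOLOCK)  
 WHERE USERNAME NOT IN (N'DOMAIN\SSRSServiceAccount') -- placeholder: your SSRS service account  
 GROUP BY CONVERT(VARCHAR(25), TIMESTART, 101)  
 ORDER BY CONVERT(VARCHAR(25), TIMESTART, 101) DESC  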

The following script gets report server usage by hour of the day.

 SELECT DATEPART(HOUR, TIMESTART) AS [Hour]  
   ,COUNT(*) AS [Generated Number]  
 FROM REPORTSERVER.DBO.EXECUTIONLOG WITH (NOLOCK)  
 --WHERE USERNAME NOT IN  
 GROUP BY DATEPART(HOUR, TIMESTART)  
 ORDER BY DATEPART(HOUR, TIMESTART)  

To get the reports generated by each user, we have the following script.

 SELECT EL.USERNAME  
   ,C.NAME  
   ,COUNT(1) AS [Generated Number]  
 FROM REPORTSERVER.DBO.EXECUTIONLOG AS EL WITH (NOLOCK)  
 INNER JOIN REPORTSERVER.DBO.CATALOG AS C WITH (NOLOCK) ON EL.REPORTID = C.ITEMID  
 GROUP BY EL.USERNAME  
   ,C.NAME  
 ORDER BY EL.USERNAME  
   ,C.NAME  
  • Oct 03 / 2013
  • 0
dbDigger, SQL Server Training

Learn from the community behind Microsoft products

We are all involved with Microsoft products, and MS has shown wonderful improvement in all of them. Obviously, a very dedicated team effort is behind this improvement. And what if you get a chance to learn from the skillful, hardworking people who are part of that success? It would be a great deal, of course. So here is a link that will lead you to a lot of quality, focused videos on various Microsoft products and technologies. Watch online or download the short videos.
