Condividere...

Sharing has its cost, but in the end it enriches everyone

Friday, January 18, 2019

SQL Server Script to move Database Files

Please note that the script below uses xp_cmdshell to issue the file move commands to the underlying OS.

It is not enabled by default on SQL Server instances.
Be sure to enable this feature before running the script.
You can do so by issuing these commands:

EXEC sp_configure 'show advanced options', 1; 
GO 
-- To update the currently configured value for advanced options. 
RECONFIGURE; 
GO 
-- To enable the feature. 
EXEC sp_configure 'xp_cmdshell', 1; 
GO 
-- To update the currently configured value for this feature. 
RECONFIGURE; 
GO 
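If you want to check the current setting first, a query like this should work (just a quick check, not part of the original post):

SELECT name, value_in_use FROM sys.configurations WHERE name = 'xp_cmdshell';
-- value_in_use = 1 means xp_cmdshell is already enabled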

This script accepts 2 parameters:
@DBname - name of the database to move
@UseDFDir - use the SQL Server instance default paths: 0 = No, 1 = Yes

If you do not want to use the defaults, be sure to set the correct paths in the script after the header:

declare @new_data_file_path nvarchar(300) = LOWER('f:\SQLDati')
declare @new_log_file_path nvarchar(300) = LOWER('l:\SQLLogs')

You can check/set the default database paths in the SQL Server instance properties.
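Alternatively, you can query them directly; these are the same SERVERPROPERTY values the procedure reads when @UseDFDir = 1:

SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;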



The script creates the dbo.MoveDBFiles stored procedure; you can launch it like this:

exec dbo.movedbfiles 'logtest', 1
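After the procedure completes, you can verify the new file locations with a query like this ('logtest' is just the example database used above):

SELECT name, type_desc, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('logtest');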


-------------------------  Script Start ------------------------------

create PROCEDURE dbo.MoveDBFiles @DBname varchar(300), @UseDFDir int = 0
 
AS 
set nocount on;
/*
Title: Move Database Files
Created:  18-01-2019
By: Ruggiero Lauria

Steps: 1) change file location, 2) take DB offline, 3) move files, 4) bring the database back online

Input parameters:
@DBname - name of the database to move
@UseDFDir - use the SQL Server instance default paths: 0 = No, 1 = Yes

Example: exec dbo.MoveDBFiles 'Db1' , 1

Notes:
Ensure that paths are set correctly prior to running the script
Ensure you have enabled xp_cmdshell on your instance


The script is provided as is and for free; use it at your own risk.
Feel free to modify and distribute it

Credits:
Thanks to Christopher Provolt for the initial script template I have used to create mine:
https://cprovolt.wordpress.com/2014/09/10/move-database-files-script/

*/

/* Set your desired directories here if you are not using the SQL Server instance default paths (@UseDFDir = 0) */
declare @new_data_file_path nvarchar(300) = LOWER('f:\SQLDati')
declare @new_log_file_path nvarchar(300) = LOWER('l:\SQLLogs')

/* Temp Table to hold script messages */

declare @message table
(
msg_num int,
message_text nvarchar(300)
)

/*Check if the DB exists and is not a System Database */
if (select database_id from sys.databases where name= @dbname) < 5 or (select count(*) from sys.databases where name= @dbname) = 0

insert into @message values((select count(*) from @message) + 1, 'Please check your DB name; note that this procedure does not process system databases')

else
 begin
 insert into @message values((select count(*) from @message) + 1, 'Processing Database: ' + @DBname )

/* Load the Instance default paths settings */
if (@UseDFDir =1)
begin
set @new_data_file_path = (select left(cast(serverproperty('InstanceDefaultDataPath') as varchar(100)), len(cast(serverproperty('InstanceDefaultDataPath') as nvarchar(100)) )   -1))
set @new_log_file_path = (select left(cast(serverproperty('InstanceDefaultLogPath') as varchar(100)), len(cast(serverproperty('InstanceDefaultLogPath') as nvarchar(100)) )   -1))
end


/* Dynamic table to hold information on database files and their locations
we will use this data to create the required move scripts
*/

declare @info table
(
id int primary key identity,
database_id int,
dbname varchar (50),
virtual_file_name varchar(50),
type_desc varchar(10),
db_file_name varchar(50),
file_path varchar(300)
)

/* Dynamic table to hold all the scripting lines created.

Steps: 1) change file location, 2) take DB offline, 3) move files, 4) bring the database back online
*/

declare @scripts table
(
step int,
scriptText varchar(300),
commtype char(4)
)

/* Dynamic table to store the changes that are to be made.
This data can be pasted into an email to inform customer of the changes
*/

/* Variable Declaration Area */
declare @id int; --id is created on temp table to use in cursor for processing data
declare @old_file_location varchar(1000) --temporarily stores the old file location for comparisons
declare @tmpString varchar(1000) -- used to temporarily hold strings
declare @temp_dbname varchar(100) -- holds temp database name
declare @temp_virtual_file_name varchar(100) -- holds temp virtual file name
declare @temp_file_name varchar(100) -- holds temp file name
declare @temp_old_file_location varchar(1000) --holds old location of the file
declare @new_file_path varchar(1000) --holds new location of the file
declare @num int = 0 -- Internal row counter
declare @numrows int -- Total number of files to move for the specified DB
declare @command nvarchar(1000) -- used to temporarily hold command to launch
declare @commtype char(4) -- used to temporarily hold command type to launch

/* get required data and place into @info dynamic table */
insert into @info
select mf.database_id,
db.name as 'dbname',
mf.name as 'virtual_file_name',
mf.type_desc,
REVERSE(SUBSTRING(REVERSE(mf.physical_name), 0, CHARINDEX('\', REVERSE(mf.physical_name)))) as 'db_file_name',
LOWER(SUBSTRING(mf.physical_name, 0, LEN(mf.physical_name) - CHARINDEX('\', REVERSE(mf.physical_name)) + 1)) as 'file_path'
from sys.master_files mf
INNER JOIN sys.databases db ON mf.database_id = db.database_id
where mf.database_id IN
(Select database_id from sys.databases
where sys.databases.name = @dbname)

/* Create the scripted commands for the files */

/* Count the number of files to move */
set @numrows = (select count(*) from @info )

/* Create a cursor on @info table  */
declare c cursor for select id from @info
open c
fetch next from c into @id

while @@fetch_Status=0 Begin

/* Assign required values into variables from @info db */
select @old_file_location = i.file_path,
@temp_virtual_file_name = i.virtual_file_name,
@new_file_path = case i.type_desc when 'ROWS' then @new_data_file_path  when 'LOG' then @new_log_file_path else '' end,
@temp_file_name = i.db_file_name,
@temp_dbname = i.dbname,
@temp_old_file_location = i.file_path
from @info i
where id = @id

/* check if we need to move the files */
if (@old_file_location <> @new_file_path and @new_file_path <>'')

begin
 -- select @old_file_location,  @new_file_path
/* if we have to move files generate the scripts */

set @num = @num +1
-- select @num as num, @numrows as numrows -- debug output
set @tmpString = 'ALTER DATABASE [' + @temp_dbname + '] MODIFY FILE (NAME=[' + @temp_virtual_file_name + '], FILENAME=''' + @new_file_path +'\' +   @temp_file_name+ ''')';
insert into @scripts values(@num, @tmpString, 'TSQL')
set @tmpString = 'MOVE /Y "' + @old_file_location + '\' + @temp_file_name + '" "' + @new_file_path  +'\' + @temp_file_name + '"'
    insert into @scripts values(@num+@numrows+1, @tmpString, 'CMD')
   end

fetch next from c into @id
End

if @num > 0
begin
  set @tmpString = 'ALTER DATABASE [' + @temp_dbname +'] SET OFFLINE WITH ROLLBACK IMMEDIATE';
  insert into @scripts values(@numrows +1, @tmpString, 'TSQL')

  set @tmpString = 'ALTER DATABASE [' + @temp_dbname +'] SET ONLINE';
  insert into @scripts values(@numrows * 2 +2, @tmpString, 'TSQL')

end
/* Close and deallocate the cursor so we can use it again for dealing with log files */
close c
deallocate c

/* Display the scripts to run */
--select * from @scripts order by step

/* Executes the commands */

if @num > 0
begin
declare c cursor for select scriptText , commtype  from @scripts order by step
open c
fetch next from c into @command, @commtype

while @@fetch_Status=0 Begin
  if @commtype = 'TSQL'
begin
 
   insert into @message values((select count(*) from @message) + 1,  'Executing TSQL: ' + @command)
   EXECUTE sp_executesql  @Command;

end
  else
begin

       insert into @message values((select count(*) from @message) + 1,  'Executing XP Command: ' + @command)
   EXEC xp_cmdshell @Command;
 
end

fetch next from c into @command, @commtype

End
close c
deallocate c
end
else
insert into @message values((select count(*) from @message) + 1, 'DB Files are already in the desired Directories. Nothing to do!' )
end
select message_text  from @message order by msg_num
go

-------------------------  Script end ------------------------------


Friday, May 6, 2016

Some Considerations about SQL Server editions and pricing


First of all, let me say that I definitely love SQL Server and I'm really impressed by the product's technical improvements over the last years, including the stunning 2016 version. But at the same time I am worried about SQL Server's market positioning for non-enterprise customers.
In my opinion, the Standard Edition price is growing too much.
The free SQL Server Express edition is really too limited in terms of resources to work as a company's central repository; Standard Edition can do this job well (although the 128 GB RAM limitation can easily become a bottleneck), but, in many cases, affording a 10K license price + the SA fee really becomes a serious obstacle to product adoption.
Please take into consideration that there are millions of small customers, using only standard SQL Server features as the backend for their ERP, accounting software, stock management and so on, that your pricing strategy is pushing away (and this is not only true for SQL Server).
So, although I am quite sure that my words will be lost in the wind, I would like to give some non-technical suggestions:
1) In SQL Server 2012 MS introduced the BI Edition; why not cut the Standard Edition price along with its BI features?
2) MS is still selling SQL Server as a product (license) and as a service (Software Assurance); it is now giving free licenses to customers migrating from Oracle, but they still have to pay for the service (SA). Why not take a step forward and decide to sell the whole product, for all customers, as a service, eliminating the initial license price?
3) MS is already selling SQL Server nearly as a service in contexts like SPLAs, Enterprise Agreements and, of course, Azure.
4) In my opinion Software Assurance is creating problems for both MS and customers: MS is practically obligated to release a new version every 2 years, and customers have to bear the side costs of upgrading; I think that adopting a more relaxed versioning cadence would simplify life for both parties.
5) Software Assurance covers the whole product price in about 3 years; please consider the benefits of a constant lifetime revenue stream and don't be so aggressive in terms of fees.
6) In the next years a lot of customers will move to the cloud; the Azure fee can be seen as a computing service fee + a software service fee. If your clients are already paying a monthly SQL Server fee, it would be very easy for them to move between the public cloud and the private cloud by simply adding or removing the computing fee.
At the same time, other SQL Server experts are complaining about the Standard Edition's feature limitations.
I think you have really developed a wonderful product but, at the same time, a questionable marketing strategy, one that positions your product as an enterprise solution even in its entry-level editions; I hope I won't have to reskill myself on PostgreSQL in the next few years.

Wednesday, February 10, 2016

SQL Server: Audit Security events


A few days ago a customer asked me to implement a security auditing solution for their production instances of SQL Server, to track user and login management events. I immediately thought about the native SQL Audit feature, but there was a problem: all users gain access to SQL Server with the same login, so to identify exactly who did what I had to capture the hostname field; unfortunately, the available fields in SQL Server audits are predefined and not modifiable (at least not easily), and the hostname is missing among them.
At that point I decided to use DDL triggers, which, unlike DML triggers, do not have a significant impact in terms of performance.
First of all we have to understand which events to intercept; using this query you can get a hierarchical view of the available events:

WITH EventsReports (name, PARENT_TYPE, type, level, sort) AS
(
    SELECT CONVERT (varchar (255), type_name), PARENT_TYPE, type, 1, CONVERT (varchar (255), type_name)
    FROM sys.trigger_event_types 
    WHERE PARENT_TYPE  IS NULL
    UNION ALL
    SELECT CONVERT (varchar (255), REPLICATE ( '|', level) + e.type_name),
        e.parent_type, e.type, level + 1,
    CONVERT (varchar (255), RTRIM (sort) + '|' + e.type_name)
    FROM sys.trigger_event_types  as e
        INNER JOIN EventsReports as d
        ON  e.parent_type =  d.type
  )
SELECT PARENT_TYPE, type, name
FROM EventsReports
-- where [name] like '% object%'
ORDER BY sort;

The event names are quite self-explanatory, but if in doubt have a look at the official documentation.

So we decided to track these events: DDL_DATABASE_SECURITY_EVENTS, DDL_DATABASE_EVENTS, DDL_SERVER_SECURITY_EVENTS
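If you want to double-check which individual events a group covers, a standalone query like this against sys.trigger_event_types should do the job (just an extra check, not required for the trigger):

SELECT e.type, e.type_name
FROM sys.trigger_event_types e
WHERE e.parent_type = (SELECT type FROM sys.trigger_event_types
                       WHERE type_name = 'DDL_SERVER_SECURITY_EVENTS');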

Then I built the DDL trigger.

First let's create a DB and the table that will store our events:

Create DataBase AdminDB;
go
use AdminDB;
go
CREATE TABLE dbo.DDL_Security_Audit_LOG  ( EventId int NOT NULL IDENTITY (1, 1) PRIMARY KEY, EventTime datetime , DbName nvarchar(100), EventType nvarchar(100) , UserName nvarchar(100) ,HostName nvarchar (100) , EventTSQL nvarchar(3000) , Data_XML xml);


Then our DDL Trigger:


Create TRIGGER [DDL_Security_Audit]
ON ALL SERVER
FOR DDL_DATABASE_SECURITY_EVENTS, DDL_DATABASE_EVENTS, DDL_SERVER_SECURITY_EVENTS
AS
BEGIN

Declare @ObjectName nvarchar(100) =  EVENTDATA().value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(100)')

Declare @EventType nvarchar(100) =  EVENTDATA().value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(100)')

Declare @DbName  nvarchar(100) = isnull( EVENTDATA().value('(/EVENT_INSTANCE/DatabaseName)[1]', 'nvarchar(1000)'),'Master')

Declare @Command nvarchar(1000) =  EVENTDATA().value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(1000)')

INSERT INTO AdminDB.dbo.DDL_Security_Audit_LOG
 VALUES ( GETDATE() ,  @DbName, @EventType, suser_sname() , HOST_NAME(), @Command, EVENTDATA()  )

END
GO
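To verify that the trigger is capturing events, you can run a quick test like this (AuditTestLogin is just a throwaway name used for illustration):

-- Create and drop a test login to generate a couple of DDL security events
CREATE LOGIN AuditTestLogin WITH PASSWORD = 'Str0ng!Passw0rd1';
DROP LOGIN AuditTestLogin;

-- Both events should now show up in the audit table
SELECT EventTime, DbName, EventType, UserName, HostName, EventTSQL
FROM AdminDB.dbo.DDL_Security_Audit_LOG
ORDER BY EventId DESC;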

Then remember to create a JOB to purge old data (if you want to)

Declare @CutOffDate datetime = DATEADD(d, -30, getdate())
Delete from AdminDB.dbo.DDL_Security_Audit_LOG where EventTime < @CutOffDate
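If you prefer to create the SQL Server Agent job directly from T-SQL, a minimal sketch could look like this (the job, step and schedule names are just illustrative):

USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Purge DDL Security Audit Log';
EXEC dbo.sp_add_jobstep
     @job_name = N'Purge DDL Security Audit Log',
     @step_name = N'Delete rows older than 30 days',
     @subsystem = N'TSQL',
     @database_name = N'AdminDB',
     @command = N'DELETE FROM dbo.DDL_Security_Audit_LOG WHERE EventTime < DATEADD(d, -30, GETDATE());';
EXEC dbo.sp_add_jobschedule
     @job_name = N'Purge DDL Security Audit Log',
     @name = N'Daily at 02:00',
     @freq_type = 4,        -- daily
     @freq_interval = 1,
     @active_start_time = 020000;
EXEC dbo.sp_add_jobserver @job_name = N'Purge DDL Security Audit Log', @server_name = N'(local)';
GO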

Now you have all the info you need to monitor your security events at server and database level.

Just notice that SQL Server has a security hole here: no DDL event is available for the DISABLE TRIGGER action.
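As a partial mitigation, you can at least periodically check whether the audit trigger has been disabled (a simple extra check, not from the original post):

SELECT name, is_disabled
FROM sys.server_triggers
WHERE name = 'DDL_Security_Audit';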

(Thanks to Alessandro, Matteo, Simone and "Rocco" for allowing me to publish this)