
Thursday, March 29, 2012

Failed to script out Dropping Default Constraint script for table columns

I used the following code to script out creating default constraints, and it works fine. But if I set so.ScriptDrops = true to script dropping the default constraints instead, it always fails with

Exception: {"Script failed for DefaultConstraint 'DF_Employee_VacationHours'. "}

InnerException: {"Object reference not set to an instance of an object."}

Can anyone shed some light on it? Thanks in advance.

using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Text;
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Common;

namespace Test2
{
    class Program
    {
        static void Main(string[] args)
        {
            Server s = new Server(".");
            StringCollection sc;
            ScriptingOptions so = new ScriptingOptions();
            so.ScriptDrops = false;
            s.ConnectionContext.SqlExecutionModes = SqlExecutionModes.CaptureSql;
            foreach (Table t in s.Databases["Adventureworks"].Tables)
            {
                foreach (Column c in t.Columns)
                {
                    if (c.DefaultConstraint != null)
                    {
                        Console.WriteLine(string.Format("{0},{1}", t.Name, c.Name));
                        sc = c.DefaultConstraint.Script(so);
                        foreach (string st in sc)
                        {
                            Console.WriteLine(string.Format("{0}", st));
                        }
                        sc = null;
                    }
                }
            }
            return;
        }
    }
}

Hi,

Thank you for reporting this issue. This is a defect in SMO, and we're filing a bug report. We will try to fix it for the next release of the product.
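Until a fix ships, one possible workaround (a sketch only, not an official fix, reusing the loop from the code above) is to skip SMO's drop scripting and build the ALTER TABLE ... DROP CONSTRAINT statements yourself from the names SMO does expose:

// Hypothetical workaround sketch: generate the DROP statements manually instead of
// calling DefaultConstraint.Script() with ScriptDrops = true.
foreach (Table t in s.Databases["Adventureworks"].Tables)
{
    foreach (Column c in t.Columns)
    {
        if (c.DefaultConstraint != null)
        {
            // Bracket-quoting is simplified here; names containing ']' would need escaping.
            string dropStatement = string.Format(
                "ALTER TABLE [{0}].[{1}] DROP CONSTRAINT [{2}]",
                t.Schema, t.Name, c.DefaultConstraint.Name);
            Console.WriteLine(dropStatement);
        }
    }
}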

Failed to reserve contiguous memory

Hello,
(I have to repost it under different header since I didn't get any comment
or answer)
We faced a problem running a Perl script which creates a big XML file and
inserts it into an ntext column of a table.
When the XML file is relatively small the process runs OK, but with big files
we get the following error appearing in the SQL Server log:
2006-04-06 16:01:13.97 spid117 WARNING: Failed to reserve contiguous memory
of Size= 19333120.
2006-04-06 16:01:14.05 spid117 Query Memory Manager: Grants=5 Waiting=0
Maximum=152183 Available=151908
2006-04-06 16:01:14.05 spid117 Global Memory Objects: Resource=2921
Locks=29503 ...
2006-04-06 16:01:14.05 spid117 Dynamic Memory Manager: Stolen=34679 OS
Reserved=5272 ...
2006-04-06 16:01:14.05 spid117 Procedure Cache: TotalProcs=763
TotalPages=2142 InUsePages=899
2006-04-06 16:01:14.05 spid117 Buffer Counts: Commited=337792 Target=337792
Hashed=302526...
2006-04-06 16:01:14.05 spid117 Buffer Distribution: Stolen=32594 Free=530
Procedures=2142...
I checked several common things posted by Microsoft such as network packet
size and the sql_variant data type; there is no connection to our problem.
Surprisingly, the same insert works OK when performed from Query Analyzer,
even for big files.
There are quite a few posts on the issue pointing at the MemToLeave
configuration; however, this is a production server and I can't 'play' with it.
I would be very grateful for your help.
Thanks,
If you can get the same hardware configuration and the same volume of
the same type of queries in a test environment, then you could probably
recreate your scenario there and "play" with fragmentation in
MemToLeave. When we called Microsoft, it came down to playing with the
-g parameter to increase the amount of memory allocated in MemToLeave
at startup. But still, that only works to a point because the
fragmentation isn't predictable - which is why it will work sometimes
(ie when you run it through QA) and fails at other times... We've just
had to push it back to developers to implement retries.
This just reinforces my opinion that all those nice little non-database
add-ons made to attract developers to SQL Server should really just
never be used...
BB|||If the same insert works OK when performed in QA, even for a big file,
that narrows the problem down to the Perl script. It could be a
timeout issue. What library do you use to connect to SQL Server in Perl?
Maybe you should post more details in the Perl user group and see if
people have similar experiences.
Mel
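For reference, the -g option mentioned above is a SQL Server startup parameter that sets the amount of address space, in megabytes, reserved outside the buffer pool (the MemToLeave region). Purely as an illustration, not a recommendation for this particular server, the startup parameters might look like:

sqlservr.exe -g384

It only takes effect after a service restart, so as suggested above it is best trialled on a comparable test server first.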


Tuesday, March 27, 2012

Failed to generate SQL script

Hi, there,
I received the following message while trying to generate a sql script on
a table:
[SQL-DMO] CreateFile error on ServerName.DbName.DP1
How to fix this problem? Thanks.
Hi
Searching Google for this gives quite a few hits
http://tinyurl.com/a8xsx
And this one looks promising:
http://tinyurl.com/d5mcq
John
"Kathy" wrote:

> Hi, there,
> I received the following message while trying to generate a sql script on
> a table:
> [SQL-DMO] CreateFile error on ServerName.DbName.DP1
> How to fix this problem? Thanks.
|||John, thank you very much for providing the information. It's very helpful
for me to further diagnose the issue.
"John Bell" wrote:
> Hi
> Searching Google for this gives quite a few hits
> http://tinyurl.com/a8xsx
> And this one looks promising:
> http://tinyurl.com/d5mcq
> John
> "Kathy" wrote:


Friday, February 24, 2012

Extracting words from strings

I've done up this script which I'll turn into a user-defined function for
extracting words from text. However it seems a little over-complicated.
Anyone got any ideas of how to simplify it?
Thanks
Damien
SET NOCOUNT ON
DECLARE @word TINYINT
DECLARE @search_string VARCHAR( 2000 )
DECLARE @delimiter CHAR( 1 )
-- Initialise variables
SET @word = 1
SET @search_string = 'This is a test function'
SET @delimiter = ' '
SET @search_string = @search_string
DECLARE @return VARCHAR( 25 )
DECLARE @string CHAR( 1 )
DECLARE @pos SMALLINT
DECLARE @old_pos SMALLINT
DECLARE @start SMALLINT
DECLARE @total_len SMALLINT
DECLARE @words SMALLINT
DECLARE @len SMALLINT
-- Initialise variables
SET @start = 1
SET @old_pos = 1
SET @pos = 1
SET @words = 1
SET @len = 0
SET @total_len = LEN( @search_string )
-- Check for errors
IF @total_len = 0
BEGIN
    RAISERROR( 'Invalid search string: %s.', 16, 1, @search_string )
END
-- SELECT @search_string AS '12345678901234'
-- Loop thru string one character at a time
WHILE @pos <= @total_len
BEGIN
    SET @string = SUBSTRING( @search_string, @pos, 1 )
    -- Check we're not at the end of the string
    IF @word > @words BREAK
    IF @pos = @total_len
    BEGIN
        SET @start = @old_pos
        SET @len = @pos - @old_pos + 1
        BREAK
    END
    -- Found a space; any previous text is the first word, any further text is a further word
    IF @string = @delimiter
    BEGIN
        IF @words = @word
        BEGIN
            -- Current word is the same as the requested word
            SET @start = @old_pos
            SET @len = @pos - @old_pos
            BREAK
        END
        ELSE
        BEGIN
            SET @old_pos = @pos + 1
        END
        SET @words = @words + 1
    END
    -- Debugging
    --SELECT @pos AS '@pos', @string AS '@string', @start AS '@start', @old_pos AS '@old_pos', @word AS '@word', @words AS '@words', @len AS '@len'
    -- Increment current position
    SET @pos = @pos + 1
END
-- Set return value
SET @return = SUBSTRING( @search_string, @start, @len )
function_exit:
-- RETURN @return
SELECT @return, LEN( @return )
SET NOCOUNT OFF
Try this....
if exists (select 'Y' from dbo.sysobjects where id = object_id(N'[dbo].[fn_Split]'))
BEGIN
    Drop Function fn_Split
END
GO
Create Function fn_Split
(
    @SplitString text,
    @Delimiter VARCHAR(1) = ' '
) RETURNS @Dictionary TABLE (word varchar(8000)) AS
BEGIN
    DECLARE
        @ParsedString VARCHAR(8000),
        @element VARCHAR(8000),
        @start_pos int,
        @end_pos int,
        @DataLength int,
        @BytesProcessed int,
        @length int
    SELECT
        @BytesProcessed = 0,
        @DataLength = DataLength(@SplitString)
    SELECT @ParsedString = Substring(@SplitString, @BytesProcessed + 1, 8000)
    WHILE @BytesProcessed < @DataLength BEGIN
        SELECT @BytesProcessed = @BytesProcessed + DataLength(@ParsedString)
        SELECT @start_pos = 0
        SELECT @end_pos = CHARINDEX(@Delimiter, @ParsedString, @start_pos + 1)
        WHILE @end_pos <> 0 BEGIN
            SELECT @length = (@end_pos - @start_pos) - 1
            SELECT
                @element = SUBSTRING(@ParsedString, @start_pos + 1, @length),
                @start_pos = @end_pos
            INSERT INTO @Dictionary
            VALUES (@element)
            SELECT @end_pos = CHARINDEX(@Delimiter, @ParsedString, @start_pos + 1)
        END
        SELECT @ParsedString = SUBSTRING(@ParsedString, @start_pos + 1, 8000)
        SELECT @ParsedString = @ParsedString + Substring(@SplitString,
            @BytesProcessed + 1, 8000 - DataLength(@ParsedString))
    END
    SELECT @element = @ParsedString
    IF @element IS NOT NULL
        INSERT INTO @Dictionary
        VALUES (@element)
    RETURN
END
GO
select * from dbo.fn_Split('This is a test function',' ')
|||I can't follow exactly what you're trying to do in that code, but it looks
like you're trying to return the string of searched text from a string being
searched. For example, searching for "B" in "ABC".
In that case you could do something like:
DECLARE
    @string VARCHAR(10)
    , @search VARCHAR(10)
SELECT
    @string = 'ABC'
    , @search = 'B'
SELECT SUBSTRING( @string, CHARINDEX(@search, @string), LEN(@search) )
That code will return "B"; if you searched for "D" it would return an empty
string.
But doing that doesn't really make sense, because you already know the
string you're searching for. Couldn't you just use CHARINDEX(), and if it
returns > 0 reuse the string that is the text you're searching for?
The only caveat in all that is that it is not case sensitive for
case-insensitive collations.
"Damien" wrote:

> I've done up this script which I'll turn into a user-defined function for
> extracting words from text. However it seems a little over-complicated.
> Anyone got any ideas of how to simplify it?
> Thanks
>
> Damien
>
> SET NOCOUNT ON
> DECLARE @.word TINYINT
> DECLARE @.search_string VARCHAR( 2000 )
> DECLARE @.delimiter CHAR( 1 )
> -- Initialise variables
> SET @.word = 1
> SET @.search_string = 'This is a test function'
> SET @.delimiter = ' '
> SET @.search_string = @.search_string
> DECLARE @.return VARCHAR( 25 )
> DECLARE @.string CHAR( 1 )
> DECLARE @.pos SMALLINT
> DECLARE @.old_pos SMALLINT
> DECLARE @.start SMALLINT
> DECLARE @.total_len SMALLINT
> DECLARE @.words SMALLINT
> DECLARE @.len SMALLINT
> -- Initialise variables
> SET @.start = 1
> SET @.old_pos = 1
> SET @.pos = 1
> SET @.words = 1
> SET @.len = 0
> SET @.total_len = LEN( @.search_string )
> -- Check for errors
> IF @.total_len = 0
> BEGIN
> RAISERROR( 'Invalid search string: %s.', 16, 1, @.search_string )
> END
> -- SELECT @.search_string AS '12345678901234'
> -- Loop thru string one character at a time
> WHILE @.pos <= @.total_len
> BEGIN
> SET @.string = SUBSTRING( @.search_string, @.pos, 1 )
> -- Check we're not at the end of the string
> IF @.word > @.words BREAK
> IF @.pos = @.total_len
> BEGIN
> SET @.start = @.old_pos
> SET @.len = @.pos - @.old_pos + 1
> BREAK
> END
> -- Found a space; any previous text is the first word, any further tex
t
> is further word
> IF @.string = @.delimiter
> BEGIN
> IF @.words = @.word
> BEGIN
> -- Current word is same as requested word
> SET @.start = @.old_pos
> SET @.len = @.pos - @.old_pos
> BREAK
> END
> ELSE
> BEGIN
> SET @.old_pos = @.pos + 1
> END
> SET @.words = @.words + 1
> END
> -- Debuggin
> --SELECT @.pos AS '@.pos', @.string AS '@.string', @.start AS '@.start',
> @.old_pos AS '@.old_pos', @.word AS '@.word', @.words AS '@.words', @.len AS '@.le
n'
> -- Increment current position
> SET @.pos = @.pos + 1
> END
>
> -- Set return value
> SET @.return = SUBSTRING( @.search_string, @.start, @.len )
>
> function_exit:
> -- RETURN @.return
> SELECT @.return, LEN( @.return )
> SET NOCOUNT OFF
|||Use a table of sequential numbers for this purpose instead of loop routines.
You can create one easily using the hack:
SELECT TOP 2000 IDENTITY ( INT ) n INTO Nbrs
FROM sysobjects s1, sysobjects s2 ;
Now your requirement is simpler:
DECLARE @search_string VARCHAR( 2000 )
DECLARE @delimiter CHAR( 1 )
SET @search_string = 'This is a test function'
SET @delimiter = ' '
SELECT SUBSTRING( @search_string, n,
    CHARINDEX( @delimiter, @search_string + @delimiter, n ) - n )
FROM Nbrs
WHERE SUBSTRING( @delimiter + @search_string, n, 1 ) = @delimiter
AND n < LEN( @search_string ) + 1 ;
Anith

Sunday, February 19, 2012

extracting table schemas with nvarchar column(on 2000) from SSMS

Hi ,
When I use SSMS to extract the script for a table, if the table is in a SQL 2000 database and has an nvarchar column, I get the following error message:

Property AnsiPaddingStatus is not available for column XXXX. This property may not exist
for this object, or may not be retrievable due to insufficient access rights. (Microsoft.SqlServer.Smo)
and I am sysadmin on both servers.

Also, I should mention that I have tried to extract the script the same way from another 2005 machine and it did work, so the problem is due to my SQL 2005 edition.
Many thanks
Different SP level? The SMO classes are updated during the SP update as well.

HTH, Jens K. Suessmeyer.

http://www.sqlserver2005.de|||Thanks for your reply,
that's a very good point, which I should have thought of before posting this issue,
but on the machine where it is failing I have got SP2 of SQL Server 2005 installed, and on the one that runs successfully I haven't got any service pack!!!
which doesn't make sense.
Thanks|||Did you try having the same Service Pack level on both machines ?

Jens K. Suessmeyer.

http://www.sqlserver2005.de

Friday, February 17, 2012

Extracting Cube detailed info

Hello,

Could I write an ActiveX Script in a DTS package to return cube info (such as CubeName, CubeStatus, CubeSize, CubePartitions, CubeLastProcessed)? If that's
possible, I could then pump the results into a text file and import them into
a table for reporting purposes.

Please let me know if this is the best route to explore DSO objects or any alternative way of extracting this type of Cube data.

-Lawrence

If you are using Analysis Services 2000, yes, using DSO from a DTS package is a good option.

If you are using Analysis Services 2005, then use AMO (Microsoft.AnalysisServices.dll), I can post sample code, let me know please.

Adrian Dumitrascu

|||

Hello Adrian,

If you could post some sample DSO & AMO code for 2000 & 2005, that would be much appreciated. This would get me started as I don't know much about the object schema for DSO nor AMO. Let me know how to get to your code site.

Sincerely,

-Lawrence

|||

Hi,

here is a sample - without any warranty. For more information see http://msdn2.microsoft.com/en-us/library/ms345089.aspx:

using System;
using System.Collections.Generic;
using System.Text;
using System.Xml;
using System.IO;
using Microsoft.AnalysisServices;

namespace GetCubeInfo
{
    class Program
    {
        static void Main(string[] args)
        {
            string dateFormat = "yyyy-MM-dd HH:mm:ss";
            try
            {
                FileStream stream;
                stream = File.Create(@"GetCubeInfo.xml");
                XmlTextWriter writer = new XmlTextWriter(stream, Encoding.UTF8);
                // Causes child elements to be indented
                writer.Formatting = Formatting.Indented;
                // Report element
                writer.WriteProcessingInstruction("xml", "version=\"1.0\" encoding=\"utf-8\"");
                writer.WriteStartElement("Server");
                // Connect to the SSAS server
                Server server = new Server();
                server.Connect(@"Integrated Security=SSPI;Persist Security Info=False;Data Source=localhost\YUKON");
                writer.WriteAttributeString("Name", null, server.Name);
                writer.WriteAttributeString("ReportCreated", null, DateTime.Now.ToString(dateFormat));
                // Get the Adventure Works cube(s)
                foreach (Database database in server.Databases)
                {
                    writer.WriteStartElement("Database");
                    writer.WriteAttributeString("Name", null, database.Name);
                    writer.WriteAttributeString("CreatedTimestamp", null, database.CreatedTimestamp.ToString(dateFormat));
                    writer.WriteAttributeString("LastSchemaUpdate", null, database.LastSchemaUpdate.ToString(dateFormat));
                    writer.WriteAttributeString("LastProcessed", null, database.LastProcessed.ToString(dateFormat));
                    writer.WriteAttributeString("LastUpdate", null, database.LastUpdate.ToString(dateFormat));
                    foreach (Cube cube in database.Cubes)
                    {
                        writer.WriteStartElement("Cube");
                        writer.WriteAttributeString("Name", null, cube.Name);
                        writer.WriteAttributeString("CreatedTimestamp", null, cube.CreatedTimestamp.ToString(dateFormat));
                        writer.WriteAttributeString("LastSchemaUpdate", null, cube.LastSchemaUpdate.ToString(dateFormat));
                        writer.WriteAttributeString("LastProcessed", null, cube.LastProcessed.ToString(dateFormat));
                        writer.WriteEndElement(); // Cube
                    }
                    foreach (Dimension dimension in database.Dimensions)
                    {
                        writer.WriteStartElement("Dimension");
                        writer.WriteAttributeString("Name", null, dimension.Name);
                        writer.WriteAttributeString("CreatedTimestamp", null, dimension.CreatedTimestamp.ToString(dateFormat));
                        writer.WriteAttributeString("LastSchemaUpdate", null, dimension.LastSchemaUpdate.ToString(dateFormat));
                        writer.WriteAttributeString("LastProcessed", null, dimension.LastProcessed.ToString(dateFormat));
                        writer.WriteEndElement(); // Dimension
                    }
                    writer.WriteEndElement(); // Database
                    writer.Flush();
                }
                writer.WriteEndElement(); // Server
                writer.Flush();
            }
            catch (Exception exception)
            {
                // Oops
                Console.WriteLine(exception.Message);
            }
        }
    }
}

|||

Hello,

I really appreciate the AMO sample code! However, my cubes are currently in 2000, so I was wondering if someone could provide some sample DSO code like the above. Output to a text file would be good enough.

BTW, to set up automation for the AMO code above, would SSIS be the best place to set up the script? I am thinking either the "Script Component" or "ActiveX" tasks. Let me know if there is a better method.

Sincerely,

-Lawrence

|||

Hi Lawrence,

I strongly recommend using the Script Component if you run this code within an SSIS package, because you can use Visual Basic .NET and can directly use the (managed) AMO interface. The ActiveX task is only there for backward compatibility reasons. For new code you should use the new tasks; otherwise you will have migration efforts in the future.

Whether to put it into a SSIS Package or not depends on your requirements and what you want to do with the data you have just gathered in that task. If it is part of your workflow/ETL process, or if you want to use some of the results in other steps/flows, or if you want to use the configuration support of SSIS, or if you want to use the scheduling support with the SQL Server Agent, or, or, or then the answer will be easy (Yes).

Regards,

Bertil

|||

Could anyone provide a sample script of the DSO objects related to what I am trying to gather here? I don't think AMO objects would work on 2000 cubes.

Your help is appreciated!

-Lawrence


Extracting Contents of Object Variable in script Task

Hey Guys,

I'm pretty new to scripting in SSIS, but I have worked a fair bit out already, although I am stuck with the following problem. If anyone can help out, that would be great!

OK, I have a package that loops through records in a table, extracts the detail lines in a For Each loop, and stores them in a variable called DetailRecordSet with a data type of Object.

I have built a Script Task so that I can send out an email confirmation for each order, and I want to list the details in an HTML table in the body of the message (I have read that the standard Send Mail task won't support this, but found an example of using .NET code to generate the email message which will support it).


What I want to know is how do I reference the columns in the Object variable so I can extract each line and add it into the string that I am creating with HTML codes, i.e.:

str = "<TABLE> <TR>" + CStr(variable for 1st column in recordset) + " </TR>"

If anyone can help me, that would be most appreciated. I've tried all the different things that I've found on the net, but nothing is working.

Thanks in Advance

Scotty

To know how to interact with an SSIS object variable's underlying type, you need to know the type itself. That is, the type of the "thing" stored by reference in the SSIS object variable.

What is the actual underlying type of the value stored in the SSIS variable of type Object?

System.Windows.Forms.MessageBox.Show(Dts.Variables("DetailRecordSet").Value.GetType().FullName)

The above MessageBox() will display the underlying type in your Script Task, except for COM callable wrappers (which show up as System.__ComObject).
|||

Sorry for the late reply Jaegd, but I have been away for a few days.

The object type is System.Object. From the variable section I select the data type to be Object.

I appreciate your help still, because I am still stuck.

Thanks

Scotty

|||

As per the previous reply, knowing what is in the object is crucial. From your reply it is a COM object, so I'll guess that it is probably an ADO recordset. In which case take a look here for some links and pointers - http://blogs.conchango.com/jamiethomson/archive/2005/02/08/SSIS_3A00_-Accessing-a-recordset-in-a-script-task.aspx

If that does not help, then help me, as I don't understand how you got there. So what has populated the variable exactly, and what method was used? If it is the For Each loop, what enumerator have you used?
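If the variable was filled by an Execute SQL Task with a full result set (the usual way to end up with an ADO recordset in an Object variable), the pattern described in the linked post is to load the recordset into a DataTable with OleDbDataAdapter.Fill and then loop over the rows to build the HTML. The sketch below is illustrative only and uses C# syntax for brevity; a 2005 Script Task would do the same calls from VB.NET. The variable name DetailRecordSet comes from the question, and the helper name and column handling are made up for the example.

using System.Data;
using System.Data.OleDb;
using System.Text;

// Illustrative helper: turn the ADO recordset stored in the SSIS Object variable
// into an HTML table string for the email body.
public static string BuildHtmlTable(object recordset)
{
    DataTable table = new DataTable();
    // OleDbDataAdapter.Fill has an overload that accepts an ADO Recordset object.
    new OleDbDataAdapter().Fill(table, recordset);

    StringBuilder html = new StringBuilder("<TABLE>");
    foreach (DataRow row in table.Rows)
    {
        html.Append("<TR>");
        foreach (DataColumn col in table.Columns)
        {
            html.Append("<TD>").Append(row[col]).Append("</TD>");
        }
        html.Append("</TR>");
    }
    html.Append("</TABLE>");
    return html.ToString();
}

// Inside the Script Task this would be called roughly as:
//   string body = BuildHtmlTable(Dts.Variables["DetailRecordSet"].Value);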