
Friday, March 23, 2012

Failed to create snapshot while bulk copy large table with TEXT/IMAGE field

Dear all,

Please help us with a snapshot bulk copy issue involving a TEXT
field. The snapshot agent failed with the message below:

The process could not bulk copy into table '"Enclosure"'

We have a big table of about 170,000 records that needs to be
replicated to other sites. There is a TEXT field in this
table. Here is the table schema:

CREATE TABLE [dbo].[Enclosure] (
[Identifier] [uniqueidentifier] NOT NULL ,
[sCommentId] [varchar] (15) COLLATE
Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[sEncName] [varchar] (50) COLLATE
Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[sUserId] [varchar] (15) COLLATE
Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[dUpdateTime] [datetime] NOT NULL ,
[sContent] [text] COLLATE Chinese_Taiwan_Stroke_CI_AS
NULL ,
[msrepl_tran_version] [uniqueidentifier] NOT NULL ,
[rowguid] uniqueidentifier ROWGUIDCOL NOT NULL
) ON [EXT_1] TEXTIMAGE_ON [EXT_1]

The "Identifier" field is an auto-generated primary key
(by newid()) of this table. The table is built on SQL
Server 2000 SP3 on Windows 2000 Server SP4.

For this problem, I've searched the newsgroups and
Google for a solution. I would summarize the suggestions as below:

1. Check disk space:
Our disk space should be enough. We have 3GB on the system
partition and 120GB on the data partition. All snapshot and
database files are located on the data partition. The data
file of the database that needs to be replicated is only 170MB.

2. Try BCP.exe command line
We tried to execute
[ bcp "select * from repl_test.dbo.Enclosure" queryout
snapshot.txt ]
at the command line. It finished successfully, but it
showed that it only bulk copied 36,739 of the 170,000
records. We have no idea why.

3. Make the TEXT field to be non-NULL
We filled the NULL values in the TEXT field with an empty
string (''), but the snapshot still could not be created
successfully.

4. Change configuration "max text repl size (B)" with
sp_configure
We checked the maximum length of the TEXT field with [ select
max(datalength(sContent)) from Enclosure ]. Then we
used "sp_configure" to set the "config_value" of the "max
text repl size (B)" entry to twice the maximum
length of the TEXT field. However, it still cannot work!
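For reference, the check and the configuration change can be sketched as follows (the 16000 value below is illustrative only; we set ours to roughly twice the measured maximum):

```sql
-- Find the longest TEXT value in the column, in bytes.
select max(datalength(sContent)) from Enclosure

-- Raise the replication limit for text columns.
-- 16000 is illustrative; pick something larger than the maximum above.
exec sp_configure 'max text repl size (B)', 16000
reconfigure
```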

None of the above suggestions solved our problem; the
snapshot still could not be created successfully. Finally I tried
changing the data type of the TEXT field to VARCHAR(7500),
and then the snapshot could be created successfully. Although that
works, it's not really what we need.

I also tried reducing the row count to see if bulk copy
has a row limitation. When the row count is smaller
(about 10,000 records), the snapshot can be created
successfully with the TEXT field.

I am not sure whether it's a bug or a limitation of SQL Server
replication. Could anybody help us with this urgent issue?
We are exhausted by this problem :~~. We need your advice
on this issue.

Thanks for your help!!

Best regards,
Louis|||Howdy

Check the disk space for the distribution database. If it's small (monitor the space taken for 5,000 rows, then 10,000 rows), try moving it to a bigger drive if need be.

You may also want to check the snapshot replication folder location - again, it could be a space issue.

See:
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/replsql/replplan_1l4e.asp

How to specify alternate snapshot locations (Enterprise Manager):

In SQL Server Enterprise Manager, expand the Replication and Publications directory, select a publication, right-click the publication, and then click Properties.

On the Snapshot Location tab, select the Generate snapshots in the following location option, then type a Universal Naming Convention (UNC) path or click the browse button (...) and browse to the location where you want to save the snapshot files.

Cheers

SG|||Hi SG,

Thanks for your answer. I just moved the "distribution" database files and the snapshot folder to a disk partition with 120GB free. The source database is only 170MB. Besides this, the winnt\temp directory has 3GB free on the system partition. I think the disk space should be OK.

I just tried the bcp.exe command again. I found it can only bulk copy 36,188 of the original 170,000 records. All data lengths of the TEXT field and the total row count are under the snapshot agent's defaults.

bcp command:

bcp repl_test..Enclosure out enclosure.bcp /Usa

Defaults of snapshot agent :

Microsoft SQL Server Snapshot Agent 8.00.760
Copyright (c) 2000 Microsoft Corporation
Microsoft SQL Server Replication Agent: CRM2-repl_test-repl_test-11

Server:

DBMS: Microsoft SQL Server
Version: 08.00.0760
user name: dbo
API conformance: 2
SQL conformance: 1
transaction capable: 2
read only: N
identifier quote char: "
non_nullable_columns: 1
owner usage: 31
max table name len: 128
max column name len: 128
need long data len: Y
max columns in table: 1024
max columns in index: 16
max char literal len: 524288
max statement len: 524288
max row size: 524288|||Additional information about bcp.exe. I think I have found the way to solve my problem.

First, I executed

bcp repl_test..Enclosure out enclosure.bcp /Usa

It could only bulk copy 36,188 of the original 170,000 records, so I tried non-text native bulk copy with the /N switch of bcp.exe:

bcp repl_test..Enclosure out enclosure.bcp /Usa /N

With non-text native bulk copy, it copied all 170,000 records successfully. Then I modified the snapshot settings in Enterprise Manager to use non-text native mode to create the snapshot. The snapshot then finished successfully!! :)

I am not sure if there are any side effects of using non-text native snapshot creation for TEXT and IMAGE fields. I will double-check the result and post it for everyone interested in this issue.

Thanks for your concern and help!!

Louis|||Hello louisyfwang,

With bcp you can set the batch size (the -b option) so that it commits blocks of rows, breaking the bulk copy into multiple transactions instead of attempting to transfer everything at once. In essence, this is mainly done for large tables.
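For example, combining the /N switch from Louis's post with a batch size (the 10,000 here is just an illustrative value):

```
bcp repl_test..Enclosure out enclosure.bcp /Usa /N /b 10000
```

This commits after every 10,000 rows instead of attempting the whole table in one transaction.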

Hope this helps.

Jose R.

|||Hi Jose,

Thanks for your suggestion. I can use the "-b" option on the bcp.exe command line, but is there any way to specify this option for the snapshot agent? I cannot find any "bcp.exe" command in the agent properties.

Best regards,
Louis

Friday, March 9, 2012

fail an execute sql task

I have a DTS package that is essentially three transformation tasks that create 3 text files. The data transformation is accomplished through an SQL query. Before I execute the transform tasks I would like to perform a couple of simple existence checks to verify data, e.g.:

if exists (select * from table where week_ending = endoflastweek(getdate()))

I could put the statement into the SQL (if exists, begin/end) of each of the transformation tasks and prevent the process from executing, but it would be easier to put it into an "execute sql task" and, if it succeeds, fail the task or make some kind of declaration not to proceed. I'm not sure how to do this... I know this can be done with an ActiveX script,

Main = DTSTaskExecResult_Failure

Is there a way to do this in an execute sql task step?|||I am not sure if you are looking for RAISERROR or RETURN here. Look up both in Books Online and see what you think. I am leaning toward RAISERROR.
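One sketch of the RAISERROR approach (myTable is a placeholder for the table in the original check, and endoflastweek() is the function from the original post, assumed to exist): an Execute SQL Task fails when its statement raises an error of severity 11 or higher, so the existence check can be inverted:

```sql
-- Fail this Execute SQL Task when last week's data is missing.
-- With an on-success precedence constraint, the transform tasks then never run.
if not exists (select * from myTable
               where week_ending = endoflastweek(getdate()))
    raiserror('No data for last week; stopping the package.', 16, 1)
```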

Sunday, February 26, 2012

Extremely long lines of text

I have a dtsx package that works fine with one exception. When I open the dtsx package in BI, it gives me the following message:

Document contains one or more extremely long lines of text. These lines will cause the editor to respond slowly when you open the file. Do you still want to open the file?

When I respond yes, the package opens and I can edit or execute with no problem. Still, I want to understand what could cause this message to occur and, more importantly, how I can get rid of the message. When I try to simply execute the package I still get the same error and it seems this will be a problem for trying to run the package from SQL Server agent.

It seems likely to me that this message refers to the dtsx file (in xml format) itself. Does that make any sense?

This issue has been discussed in previous posts:

"Document contains one or more extremely long lines of text. These lines will cause the editor to respond slowly when you open the file. Do you still want to open the file?"

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=313598&SiteID=1

|||Thanks for the information. I obviously did not use the right terms when I searched.

Extremely long lines

Hi,

When I open a project in SSIS, it shows the message:

"Document contains one or more extremely long lines of text. These lines will cause the editor to respond slowly when you open the file. Do you still want to open the file."

I click Yes and my project opens normally. Does anyone know why this happens? My project is small; it has one package that imports some Excel files into SQL Server 2005.

Thanks

André Rentes

Hi

Have a look at this link from the forum. Hope it helps:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=313598&SiteID=1

Cheers,

Grant

Friday, February 24, 2012

Extracting XML data to columns during query

I'm working with an inherited SQL 2000 database (now moved to 2005)
that stores strings of XML in a text column. I'd like to avoid
changing the schema right now. Each row's XML data looks something
like:
<root><element id='5' name='bob' message='hello' /></root>
The exact attributes in the inner element are unknown, but what I
would like to be able to do is return them as columns in a query such
that I would get:
id name message
-- -- --
5 bob hello
Any thoughts?|||I was just now experimenting with something similar. Most of this is right
out of the MSDN help. This is reading from an external XML file.
-- create tables for later population using OPENXML.
create table Customers (CustomerID varchar(20) primary key,
ContactName varchar(20),
CompanyName varchar(20))
go
create table Orders (CustomerID varchar(20), OrderDate datetime)
go
declare @xmlDocument xml
select @xmlDocument = cast(x as xml)
from OpenRowset (bulk 'P:\SQL scripts\Examples\XML\SourceOfXml.xml',
single_blob) R (x)
select @xmlDocument
-- Contents of the source XML file.
--
-- <?xml version="1.0" encoding="windows-1252" ?>
-- <ROOT>
-- <Customers CustomerID="XYZAA" ContactName="Joe" CompanyName="Company1">
-- <Orders CustomerID="XYZAA" OrderDate="2000-08-25T00:00:00"/>
-- <Orders CustomerID="XYZAA" OrderDate="2000-10-03T00:00:00"/>
-- </Customers>
-- <Customers CustomerID="XYZBB" ContactName="Steve" CompanyName="Company2">No Orders yet!
-- </Customers>
-- </ROOT>
declare @docHandle int
exec sp_xml_preparedocument @docHandle output, @xmlDocument
-- Use OpenXML to provide a rowset consisting of customer data.
insert Customers
select *
from OpenXML (@docHandle, N'/ROOT/Customers')
with Customers
-- Use OpenXML to provide a rowset consisting of order data.
insert Orders
select *
from OpenXML (@docHandle, N'//Orders')
with Orders
-- Using OpenXML in a SELECT statement.
select *
from OpenXML (@docHandle, N'/ROOT/Customers/Orders')
with (CustomerID nchar(5) '../@CustomerID', OrderDate datetime)
-- Remove the internal representation of the XML document.
exec sp_xml_removedocument @docHandle
/*
drop table Customers
drop table Orders
*/
"Brian Vallelunga" wrote:

> I'm working with an inherited SQL 2000 database (now moved to 2005)
> that stores strings of XML in a text column. I'd like to avoid
> changing the schema right now. Each row's XML data looks something
> like:
> <root><element id='5' name='bob' message='hello' /></root>
> The exact attributes in the inner element are unknown, but what I
> would like to be able to do is return them as columns in a query such
> that I would get:
> id name message
> -- -- --
> 5 bob hello
>
> Any thoughts?
>|||Hello Brian,

> I'm working with an inherited SQL 2000 database (now moved to 2005)
> that stores strings of XML in a text column. I'd like to avoid
> changing the schema right now. Each row's XML data looks something
> like:
> The exact attributes in the inner element are unknown, but what I
> would like to be able to do is return them as columns in a query such
> that I would get:
There are lots of ways of doing this if the number of attributes is known, but
when they aren't, you're going to have a hard time shaping them into a meaningful
table. So what exactly do you mean by "attributes are unknown?"
Thanks!
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/|||On Apr 3, 11:17 pm, Kent Tegels <kteg...@.develop.com> wrote:
> Hello Brian,
>
> There are lots of ways of doing this if the number of attributes is known, but
> when they aren't, you're going to have a hard time shaping them into a meaningful
> table. So what exactly do you mean by "attributes are unknown?"
> Thanks!
> Kent Tegels
> DevelopMentorhttp://staff.develop.com/ktegels/
Well, the xml data holds responses to online forms. Each form may have
different fields and each form field results in a single attribute key/
value pair. If a form has a first name and last name, the attributes
for these two would show up in the XML.
Obviously this makes it a bit more difficult than if the values were
known. However, for this, we can assume that any given set of data
pulled will be for one particular form, and will thus have the same
attributes in the XML data. I may just have to do this on the client
end, but thought I'd see if a SQL method was available.


Extracting XML data from SQL 2000 to SQL 2005 database table.

Hi,

We need to extract data from XML that is contained in TEXT columns in a SQL Server 2000 database table. Can this be done easily with SSIS (NB: we cannot easily use schemas with our data)? Please kindly provide a solution to this problem.

Thanks

VDeevi wrote:

Hi,

We need to extract data from XML that is contained in TEXT columns in a SQL Server 2000 database table. Can this be done easily with SSIS (NB: we cannot easily use schemas with our data)? Please kindly provide a solution to this problem.

Thanks

What methods have you tried so far and why do they fail?

-Jamie|||Hi Jamie,

Thanks for your reply. We tried OPENXML, sp_xml_preparedocument, etc., but those are all roundabout approaches. What we want to know is whether there is any simple technique in SSIS for retrieving the data from XML stored in a SQL Server 2000 TEXT field.

Thanks,|||I've just had a go at this...

When you select the data, a TEXT field appears in the SSIS pipeline as a DT_TEXT. To insert it into an XML field, use a derived column transform to convert it to DT_NTEXT.
If you don't like DT_NTEXTs in your pipeline, then convert it to DT_WSTR and then to DT_STR, which can also be inserted into an XML field.

So yes, SSIS can do this very very well.

-Jamie

Extracting words from strings

I've done up this script which I'll turn into a user-defined function for
extracting words from text. However it seems a little over-complicated.
Anyone got any ideas of how to simplify it?
Thanks
Damien
SET NOCOUNT ON
DECLARE @word TINYINT
DECLARE @search_string VARCHAR( 2000 )
DECLARE @delimiter CHAR( 1 )
-- Initialise variables
SET @word = 1
SET @search_string = 'This is a test function'
SET @delimiter = ' '
DECLARE @return VARCHAR( 25 )
DECLARE @string CHAR( 1 )
DECLARE @pos SMALLINT
DECLARE @old_pos SMALLINT
DECLARE @start SMALLINT
DECLARE @total_len SMALLINT
DECLARE @words SMALLINT
DECLARE @len SMALLINT
-- Initialise variables
SET @start = 1
SET @old_pos = 1
SET @pos = 1
SET @words = 1
SET @len = 0
SET @total_len = LEN( @search_string )
-- Check for errors
IF @total_len = 0
BEGIN
RAISERROR( 'Invalid search string: %s.', 16, 1, @search_string )
END
-- SELECT @search_string AS '12345678901234'
-- Loop thru string one character at a time
WHILE @pos <= @total_len
BEGIN
SET @string = SUBSTRING( @search_string, @pos, 1 )
-- Check we're not at the end of the string
IF @word > @words BREAK
IF @pos = @total_len
BEGIN
SET @start = @old_pos
SET @len = @pos - @old_pos + 1
BREAK
END
-- Found a space; any previous text is the first word, any further text is a further word
IF @string = @delimiter
BEGIN
IF @words = @word
BEGIN
-- Current word is same as requested word
SET @start = @old_pos
SET @len = @pos - @old_pos
BREAK
END
ELSE
BEGIN
SET @old_pos = @pos + 1
END
SET @words = @words + 1
END
-- Debugging
--SELECT @pos AS '@pos', @string AS '@string', @start AS '@start', @old_pos AS '@old_pos', @word AS '@word', @words AS '@words', @len AS '@len'
-- Increment current position
SET @pos = @pos + 1
END
-- Set return value
SET @return = SUBSTRING( @search_string, @start, @len )
function_exit:
-- RETURN @return
SELECT @return, LEN( @return )
SET NOCOUNT OFF|||Try this....
if exists (select 'Y' from dbo.sysobjects where id = object_id(N'[dbo].[fn_Split]'))
BEGIN
Drop Function fn_Split
END
GO
Create Function fn_Split
(
@SplitString text,
@Delimiter VARCHAR(1) = ' '
) RETURNS @Dictionary TABLE (word varchar(8000)) AS
BEGIN
DECLARE
@ParsedString VARCHAR(8000),
@element VARCHAR(8000),
@start_pos int,
@end_pos int,
@DataLength int,
@BytesProcessed int,
@length int
SELECT
@BytesProcessed = 0,
@DataLength = DataLength(@SplitString)
SELECT @ParsedString = Substring(@SplitString, @BytesProcessed + 1, 8000)
WHILE @BytesProcessed < @DataLength BEGIN
SELECT
@BytesProcessed = @BytesProcessed + DataLength(@ParsedString)
SELECT
@start_pos = 0
SELECT @end_pos = CHARINDEX(@Delimiter, @ParsedString, @start_pos + 1)
WHILE @end_pos <> 0 BEGIN
SELECT @length = (@end_pos - @start_pos) - 1
SELECT
@element = SUBSTRING(@ParsedString, @start_pos + 1, @length),
@start_pos = @end_pos
INSERT INTO @Dictionary
VALUES (@element)
SELECT @end_pos = CHARINDEX(@Delimiter, @ParsedString, @start_pos + 1)
END
SELECT @ParsedString = SUBSTRING(@ParsedString, @start_pos + 1, 8000)
SELECT @ParsedString = @ParsedString + Substring(@SplitString,
@BytesProcessed + 1, 8000 - DataLength(@ParsedString))
END
SELECT @element = @ParsedString
IF @element IS NOT NULL
INSERT INTO @Dictionary
VALUES (@element)
RETURN
END
GO
select * from dbo.fn_Split('This is a test function',' ')|||I can't follow exactly what you're trying to do in that code, but it looks
like you're trying to return the string of searched text from a string being
searched. For example, searching for "B" in "ABC".
In that case you could do something like:
DECLARE
@string VARCHAR(10)
, @search VARCHAR(10)
SELECT
@string = 'ABC'
, @search = 'B'
SELECT SUBSTRING( @string, CHARINDEX(@search, @string), LEN(@search) )
That code will return "B"; if you searched for "D" it would return an empty
string.
But doing that doesn't really make sense, because you already know the
string you're searching for. Couldn't you just use CHARINDEX(), and if it
returns > 0 reuse the string that is the text you're searching for?
The only caveat in all that is that it is not case sensitive for
case-insensitive collations.
"Damien" wrote:

> I've done up this script which I'll turn into a user-defined function for
> extracting words from text. However it seems a little over-complicated.
> Anyone got any ideas of how to simplify it?
> Thanks
>
> Damien
>
> SET NOCOUNT ON
> DECLARE @.word TINYINT
> DECLARE @.search_string VARCHAR( 2000 )
> DECLARE @.delimiter CHAR( 1 )
> -- Initialise variables
> SET @.word = 1
> SET @.search_string = 'This is a test function'
> SET @.delimiter = ' '
> SET @.search_string = @.search_string
> DECLARE @.return VARCHAR( 25 )
> DECLARE @.string CHAR( 1 )
> DECLARE @.pos SMALLINT
> DECLARE @.old_pos SMALLINT
> DECLARE @.start SMALLINT
> DECLARE @.total_len SMALLINT
> DECLARE @.words SMALLINT
> DECLARE @.len SMALLINT
> -- Initialise variables
> SET @.start = 1
> SET @.old_pos = 1
> SET @.pos = 1
> SET @.words = 1
> SET @.len = 0
> SET @.total_len = LEN( @.search_string )
> -- Check for errors
> IF @.total_len = 0
> BEGIN
> RAISERROR( 'Invalid search string: %s.', 16, 1, @.search_string )
> END
> -- SELECT @.search_string AS '12345678901234'
> -- Loop thru string one character at a time
> WHILE @.pos <= @.total_len
> BEGIN
> SET @.string = SUBSTRING( @.search_string, @.pos, 1 )
> -- Check we're not at the end of the string
> IF @.word > @.words BREAK
> IF @.pos = @.total_len
> BEGIN
> SET @.start = @.old_pos
> SET @.len = @.pos - @.old_pos + 1
> BREAK
> END
> -- Found a space; any previous text is the first word, any further tex
t
> is further word
> IF @.string = @.delimiter
> BEGIN
> IF @.words = @.word
> BEGIN
> -- Current word is same as requested word
> SET @.start = @.old_pos
> SET @.len = @.pos - @.old_pos
> BREAK
> END
> ELSE
> BEGIN
> SET @.old_pos = @.pos + 1
> END
> SET @.words = @.words + 1
> END
> -- Debuggin
> --SELECT @.pos AS '@.pos', @.string AS '@.string', @.start AS '@.start',
> @.old_pos AS '@.old_pos', @.word AS '@.word', @.words AS '@.words', @.len AS '@.le
n'
> -- Increment current position
> SET @.pos = @.pos + 1
> END
>
> -- Set return value
> SET @.return = SUBSTRING( @.search_string, @.start, @.len )
>
> function_exit:
> -- RETURN @.return
> SELECT @.return, LEN( @.return )
> SET NOCOUNT OFF
>|||Use a table of sequential numbers for this purpose instead of loop routines.
You can create one easily using the hack:
SELECT TOP 2000 IDENTITY ( INT ) n INTO Nbrs
FROM sysobjects s1, sysobjects s2 ;
Now your requirement is simpler:
DECLARE @search_string VARCHAR( 2000 )
DECLARE @delimiter CHAR( 1 )
SET @search_string = 'This is a test function'
SET @delimiter = ' '
SELECT SUBSTRING( @search_string, n, CHARINDEX( @delimiter,
@search_string + @delimiter, n ) - n )
FROM Nbrs
WHERE SUBSTRING( @delimiter + @search_string, n, 1 ) = @delimiter
AND n < LEN( @search_string ) + 1 ;
Anith

Friday, February 17, 2012

Extracting and converting text fields.

Hi All
I have a bug where a text field is saving text to a limit of 256 characters,
which is nowhere near enough. The table structure is an ID with an
identity int, and a field called responseText.
I cannot run an alter table such as "alter table tempresponses alter column
responsetext varchar(5000)"
because "Cannot alter column 'responsetext' because it is 'text'."
I have tried to add a temporary column, but this makes it worse:
Alter table tempresponses add temp_column varchar(5000) --Add temp
column
update tempresponses set temp_column = cast(ResponseText as varchar) --
Fill out. Size irrelevant as is max 256 currently
alter table tempresponses drop column ResponseText -- Drop the TEXT
alter table tempresponses add ResponseText varchar (5000) -- read as
varchar
update tempresponses set ResponseText=temp_column
alter table tempresponses drop column temp_column
But this doesn't seem to work, because it promptly crops the data at 30 characters
on doing the cast, and I cannot see why. I got this solution from the
Microsoft Books Online.
Is there anything else I can try?
I have two tables that are doing this currently, and people are going to
lose data if I can't figure this out. Shutting the database down is about
my only option at this stage.
Help.
Many Thanks
Ashley|||> I have a bug where a text field is saving text to a limit of 256
> characters,
Are you sure that is all that is being saved? Or are you relying on Query
Analyzer in its default configuration and counting the characters that it
returns? Try SELECT DATALENGTH(column) to see if the length really is
255/256.
This is a very common symptom and *almost always* it is a mis-diagnosis
because of http://www.aspfaq.com/2272.
If the data really isn't getting into the table, then it is usually either a
stored procedure @param declaration issue, where the defined size is
VARCHAR(x), or a problem in the application that is generating the call to
do the insert.|||Yeah - the stored procedure that initially populated the table used varchar, so I
fixed that instead. I did wonder for a while if it was Query Analyser.
The mis-diagnosis info was very useful and I have put it into a FAQ db that I
use for my own reference. Sweet, thanks
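A minimal repro of the stored procedure pitfall described above (table and procedure names are made up for illustration):

```sql
create table t (c varchar(1000))
go
-- @p is declared without a length, so it is silently varchar(1).
create proc ins_t @p varchar as insert t values (@p)
go
exec ins_t 'hello'
-- Only one character arrived:
select datalength(c) from t   -- returns 1, not 5
```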
|||On Fri, 29 Jul 2005 11:04:37 +1200, "amb" <@.> wrote:
(snip)
Hi Ashley,
Aaron already replied to your main question, but he didn't comment on
this part:

>update tempresponses set temp_column = cast(ResponseText as varchar) --
(snip)
>But this doesn't seem to work because it promptly crops the data at 30 chars
>on doing the cast. I cannot see why.
If varchar (or char) is specified without length, it will in most cases
default to a length of 1 character. The only exception is in a CAST
function: for reasons beyond me, the default is set to 30 characters in
this context.
That's why the statement above crops your data to 30 characters.
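A quick way to see that default in action:

```sql
-- CAST with no length specified defaults to varchar(30),
-- so a 100-character string comes back cropped to 30.
select len(cast(replicate('x', 100) as varchar))
```

Giving the cast an explicit length, e.g. cast(ResponseText as varchar(5000)), avoids the cropping in the temp-column approach.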
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)