
Thursday, March 29, 2012

Failed to retrieve long data for column

Hi,

We have an SSIS project where we load a lot of image files using an OLE DB Source component and write them to the database as varbinary through an OLE DB Destination component.

Things were fine until now, but I'm getting an error. The log reads:

There is not enough space on the disk. Long data was retrieved and can't be added to the data flow task buffer.

SSIS Error Code: DTS_E_INDUCEDTRANSFORMFAILUREONERROR.

Is this method of loading the files using an OLE DB Source not an efficient one? If not, is there a way I can get this done? Comments are highly appreciated!!

Thanks in Advance

Have you checked to see if you are running out of physical disk space?

Unless you have a directory specified in the BLOBTempStoragePath property of the data flow, the default location will be the directory specified by the TEMP environment variable.

Make sure that the directory listed by either of those isn't running out of disk space. If so, then set BLOBTempStoragePath to a directory containing more disk space.
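(For illustration, a hedged sketch of overriding that property at run time with dtexec's /SET option - the package and task names below are hypothetical, and you can just as easily set BLOBTempStoragePath in the data flow task's property grid in the designer. BufferTempStoragePath is the companion property for non-BLOB buffer spooling and can be redirected the same way:)

REM spool BLOB data to a drive with plenty of free space
dtexec /F LoadImages.dtsx /SET "\Package\Data Flow Task.Properties[BLOBTempStoragePath];E:\SSISTemp"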

Hi Phil,

Thanks for the valuable inputs. Is it OK if I just point it to a directory on a physical drive that has enough disk space? What purpose does this BLOB storage serve?


MShetty wrote:

Hi Phil,

Thanks for the valuable inputs. Is it OK if I just point it to a directory on a physical drive that has enough disk space? What purpose does this BLOB storage serve?

The BLOB storage path is used to hold the contents of a varbinary field if it cannot fit into memory.

So point that parameter to a directory that contains enough disk space and try it again.

Thanks Phil.. It worked :)

I got the same message, "Failed to retrieve long data for column". I was running an SSIS job to import tables from an Access database to SQL 2000 so we could change the back end to SQL. I had everything working OK. Then I had to change the source connection from the dev Access back end to the production Access back end so I could get the latest data. One field in one of 83 tables kept blowing up. It was an Ntext field. I tried everything in these posts to no avail. Finally I noticed that the size of the production mdb was 3x the size of my dev mdb. So, I compacted the production mdb and reran the SSIS package. It worked like a charm. Hope this may help someone else out there.
tom in stumptown


Tuesday, March 27, 2012

Failed to manage image field

In my simple table in MSDE 2000A, I have an image field. I have enabled 'text in row' using the stored procedure 'sp_tableoption'.
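(For reference, a minimal sketch of that option call - the table name Pictures is hypothetical; the value can be ON, OFF, or a byte limit of up to 7000:)

-- keep text/image values up to 7000 bytes in the data row itself
EXEC sp_tableoption 'Pictures', 'text in row', '7000'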

How do I get JPEG images into the table, and how do I read them back?

Now, I ask a naive question. Has anybody ever captured an image from the file system, and saved it in SQL 2000 (or MSDE 2000A, which I am using for development)?

If you don't save images directly to the database, how do you manage your images stored separately from the database (i.e., on the filesystem)?

For the last two days, I have searched the net and I regret to say I am not making any headway. Lots of the stuff on the net has not been very useful.

Please help. How did you do yours? If it matters, I am using a VB .NET 2003 front-end. Let me know whatever method you used.

I am desperate.


Finally done!

I read the image into a MemoryStream, converted it to a Byte array, and saved that to the table. To retrieve, I did the converse.

Friday, March 23, 2012

Failed to create snapshot while bulk copy large table with TEXT/IMAGE field

Dear all,

Please help us with a snapshot bulk copy issue on a TEXT
field. The snapshot agent failed with the message below:

The process could not bulk copy into table '"Enclosure"'

We have a big table of about 170,000 records that needs to
be replicated to other sites. There is a TEXT field in this
table. Here is the table schema:

CREATE TABLE [dbo].[Enclosure] (
[Identifier] [uniqueidentifier] NOT NULL ,
[sCommentId] [varchar] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[sEncName] [varchar] (50) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[sUserId] [varchar] (15) COLLATE Chinese_Taiwan_Stroke_CI_AS NOT NULL ,
[dUpdateTime] [datetime] NOT NULL ,
[sContent] [text] COLLATE Chinese_Taiwan_Stroke_CI_AS NULL ,
[msrepl_tran_version] [uniqueidentifier] NOT NULL ,
[rowguid] uniqueidentifier ROWGUIDCOL NOT NULL
) ON [EXT_1] TEXTIMAGE_ON [EXT_1]

The "Identifier" field is an auto-generated primary key
(by newid()) of this table. The table is built on SQL
server 2000 SP3 over Windows 2000 sever SP4.

For this problem, I've searched the newsgroups and
Google for a solution. I would summarize the suggestions below:

1. Check disk space:
Our disk space should be enough. We have 3GB in the system
partition and 120GB in the data partition. All snapshot and
database files are located on the data partition. The data
file of the database that needs to be replicated is only 170MB.

2. Try the bcp.exe command line
We tried to execute
[ bcp "select * from repl_test.dbo.Enclosure" queryout snapshot.txt ]
on the command line. It finished successfully, but it
showed that it only bulk copied 36,739 records out of 170,000.
We have no idea why this happens.

3. Make the TEXT field non-NULL
We filled in an empty string ('') for the NULL values in the
TEXT field, but the snapshot still could not be created
successfully.

4. Change the "max text repl size (B)" configuration with sp_configure
We checked the maximum length of the TEXT field with [ select
max(datalength(sContent)) from Enclosure ]. Then we used
sp_configure to set the configured value of the "max text repl
size (B)" entry to 2 times the maximum length of the TEXT field.
However, it still did not work!
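(For reference, a hedged sketch of the commands involved in step 4; 2147483647 bytes is the documented maximum for this option, which is a blunt but safe upper bound:)

-- find the longest value currently stored in the TEXT column
select max(datalength(sContent)) from Enclosure
go
-- raise the replication limit for text/image data
exec sp_configure 'max text repl size', 2147483647
reconfigure
go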

The above suggestions did not solve our problem; the
snapshot still could not be created successfully. Finally, I tried
modifying the data type of the TEXT field to VARCHAR(7500),
and then the snapshot could be created successfully. Although
that works, it's not really what we need.

I also tried reducing the row count to see if bulk copy
has a row limitation. When the row count was smaller
(about 10,000 records), the snapshot could be created
successfully with the TEXT field.

I am not sure whether it's a bug or a limitation of SQL Server
replication. Could anybody help us with this urgent issue?
We are exhausted by this problem :~~. We need your advice
about this issue.

Thanks for your help!!

Best regards,
Louis
Howdy

Check the disk space for the distribution database - if it's small (monitor the space taken for 5,000 rows, then 10,000 rows), try moving it to a bigger drive if need be.

You may also want to check the snapshot replication folder location - again, it could be a space issue.

See:
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/replsql/replplan_1l4e.asp

How to specify alternate snapshot locations (Enterprise Manager):

In SQL Server Enterprise Manager, expand the Replication folder and then Publications, select a publication, right-click it, and then click Properties.

On the Snapshot Location tab, select the Generate snapshots in the following location option, and then type a Universal Naming Convention (UNC) path or click the browse button (...) and browse for the location where you want to save snapshot files.

Cheers

SG

Hi SG,

Thanks for your answer. I just moved the file location of the distribution database and the snapshot folder to a disk partition with 120GB free. The source database is only 170MB. Besides this, the winnt\temp directory has 3GB free in the system partition. I think the disk space should be OK.

I just tried the bcp.exe command again. I found it can only bulk copy 36,188 records out of the original 170,000. The data lengths of the TEXT field and the total data rows are all under the snapshot agent's defaults.

bcp command:

bcp repl_test..Enclosure out enclosure.bcp /Usa

Defaults of the snapshot agent:

Microsoft SQL Server Snapshot Agent 8.00.760
Copyright (c) 2000 Microsoft Corporation
Microsoft SQL Server Replication Agent: CRM2-repl_test-repl_test-11

Server:

DBMS: Microsoft SQL Server
Version: 08.00.0760
user name: dbo
API conformance: 2
SQL conformance: 1
transaction capable: 2
read only: N
identifier quote char: "
non_nullable_columns: 1
owner usage: 31
max table name len: 128
max column name len: 128
need long data len: Y
max columns in table: 1024
max columns in index: 16
max char literal len: 524288
max statement len: 524288
max row size: 524288

Additional information on bcp.exe. I think I found the way to solve my problem.

First, I executed

bcp repl_test..Enclosure out enclosure.bcp /Usa

It could only bulk copy 36,188 records out of the original 170,000. So I tried a non-text native bulk copy with the /N switch of bcp.exe:

bcp repl_test..Enclosure out enclosure.bcp /Usa /N

With a non-text native bulk copy, it copied all 170,000 records successfully. Then I modified the snapshot settings in Enterprise Manager to use non-text native mode to create the snapshot. The snapshot then finished successfully!! :)

I am not sure if there are any side effects to using non-text native snapshot creation for TEXT and IMAGE fields. I will double-check the result and post it for everyone interested in this issue.

Thanks for your concern and help!!

Louis

Hello louisyfwang,

With bcp you can set the batch size (the -b option) so that it commits blocks of rows, breaking the bulk copy into multiple transactions instead of attempting to transfer everything at once. In essence, this is mainly done for large tables.
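(A hedged illustration - note that -b applies when loading rows in, where it controls how many rows are committed per transaction; the 10,000-row batch size here is an arbitrary example:)

bcp repl_test..Enclosure in enclosure.bcp /Usa /N -b 10000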

Hope this helps.

Jose R.

Originally posted by louisyfwang
Additional information on bcp.exe. I think I found the way to solve my problem.

First, I executed

bcp repl_test..Enclosure out enclosure.bcp /Usa

It could only bulk copy 36,188 records out of the original 170,000. So I tried a non-text native bulk copy with the /N switch of bcp.exe:

bcp repl_test..Enclosure out enclosure.bcp /Usa /N

With a non-text native bulk copy, it copied all 170,000 records successfully. Then I modified the snapshot settings in Enterprise Manager to use non-text native mode to create the snapshot. The snapshot then finished successfully!! :)

I am not sure if there are any side effects to using non-text native snapshot creation for TEXT and IMAGE fields. I will double-check the result and post it for everyone interested in this issue.

Thanks for your concern and help!!

Louis

Hi Jose,

Thanks for your suggestion. I can use the "-b" option on the bcp.exe command line. Is there any way to specify this option in the snapshot agent? I cannot find any "bcp.exe" command in the agent properties.

Best regards,
Louis

Wednesday, March 7, 2012

Fact Table Design - Help

Hi There,
I have to build a Fact table, and I have this situation.
My OLTP database always keeps only the last image of the
record.
Table
Incident(IncidentId, CreationTime, StatusId, PriorityId,
RankId, Description, ...)
My log database keeps a table per field and tracks all
the changes.
Incident_CreationTime(IncidentId, CreationTime, Timestamp)
Incident_StatusId(IncidentId, StatusId, Timestamp)
Incident_PriorityId(IncidentId, PriorityId, Timestamp)
Incident_RankId(IncidentId, RankId, Timestamp)
You got the Idea.
I have to base my Incident fact table on the log
database. So what's the best way to create the fact
table in this case? What about the dimensions too?
Thanks

What are you trying to measure and how are you measuring it "by"?
These are the inputs to your design, not the existing schema!
"Elie Khammar" <ekhammar@.positron.qc.ca> wrote in message
news:0cd301c3bb40$ec0fb7e0$a401280a@.phx.gbl...
quote:

> Hi There,
> I have to build a Fact table, and I have this situation.
> My OLTP database, keeps always the last image of the
> record.
> Table
> Incident(IncidentId, CreationTime, StatusId, PriorityId,
> RankId, Description, ...)
> My Log Database, keeps a table per field and tracks all
> the changes.
> Incident_CreationTime(IncidentId, CreationTime, Timestamp)
> Incident_StatusId(IncidentId, StatusId, Timestamp)
> Incident_PriorityId(IncidentId, PriorityId, Timestamp)
> Incident_RankId(IncidentId, RankId, Timestamp)
> You got the Idea.
> I have to be based on the Log database to create my Fact
> Table of Incident. So what's the best way to create the
> fact table in this case ? What about the dimension too ?
> Thanks
>
I'm trying to measure the number of incidents per
StatusId, per PriorityId, per RankId.
quote:

>--Original Message--
>What are you trying to measure and how are you measuring it "by"?
>These are the inputs to your design, not the existing schema!
>
>"Elie Khammar" <ekhammar@positron.qc.ca> wrote in message
>news:0cd301c3bb40$ec0fb7e0$a401280a@phx.gbl...
>.
>
You probably forgot about the date dimension. Then your fact table would
look something like:
DateID
StatusID
PriorityID
RankID
IncidentCount
And you would have four dimension tables, Date, Status, Priority, and Rank
each having the descriptive elements of those dimensions, like Date,
StatusDescription, PriorityNumber, RankNumber, or whatever is meaningful for
you for those dimensions.
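(To make that concrete, a hypothetical sketch - the table, column, and key names are illustrative only, not from the original post:)

-- illustrative star-schema fact table for the design described above
CREATE TABLE FACT_Incident (
DateID int NOT NULL ,
StatusID int NOT NULL ,
PriorityID int NOT NULL ,
RankID int NOT NULL ,
IncidentCount int NOT NULL DEFAULT (1) -- one row per incident occurrence
)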
"Elie Khammar" <ekhammar@.positron.qc.ca> wrote in message
news:067401c3bb4f$b6e6dee0$a001280a@.phx.gbl...[QUOTE]
> I'm trying to measure the number of incidents per
> Statusid, PriorityId, per RankId.
>
> it "by" ?
> schema!
> PriorityId,
> Timestamp)|||Hi Kevin,
Thanks for answering me back. I still have another
question for you.
It's not the first time that I've developed a data warehouse. My
only worry here is the fact that these tables are related
together in a many-to-many relationship in the OLTP system.
So If I do it like you suggested,
(IncidentNumber, DateId, StatusId, PriorityId, RankId,
IncidentCount)
I could end up with something like this for each incident in my
FACT table:
1, 20031208, 1, 0, 0, 1
1, 20031208, 0, 1, 1, 1
1, 20031208, 4, 0, 0, 1
1, 20031208, 5, 0, 0, 1
Because there is a possibility that the StatusId, PriorityId,
and RankId change many times during the life
cycle of an incident.
So what I thought to do is create a separate fact table for
each: Fact_IncidentStatus, Fact_IncidentPriority,
Fact_IncidentRank. Each of these fact tables is related to
its own dimension, and then all those fact tables are
related to FACT_Incident, which contains only unique
incident numbers.
Do you think this is a good design?
Thanks for your time
quote:

>--Original Message--
>You probably forgot about the date dimension. Then your fact table would
>look something like:
>DateID
>StatusID
>PriorityID
>RankID
>IncidentCount
>And you would have four dimension tables, Date, Status, Priority, and Rank,
>each having the descriptive elements of those dimensions, like Date,
>StatusDescription, PriorityNumber, RankNumber, or whatever is meaningful for
>you for those dimensions.
>
>"Elie Khammar" <ekhammar@positron.qc.ca> wrote in message
>news:067401c3bb4f$b6e6dee0$a001280a@phx.gbl...
>.
>
Elie,
It seems to me that you need to have a separate dimension
for the incident numbers.
You need:
DIM_Date
DIM_Status
DIM_Priority
DIM_Rank
DIM_Incident
And:
FACT_IncidentCount
It would carry a surrogate key from each of your dimensions and a
count field that contains just the number "1". As
the cube aggregates on each dimension, it adds the
count field up.
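(As an illustrative sketch - the table and column names are hypothetical - the 1s then roll up with a simple aggregate:)

-- incidents per status: summing the 1s yields the count
select d.StatusDescription, sum(f.IncidentCount) as Incidents
from FACT_IncidentCount f
join DIM_Status d on d.StatusID = f.StatusID
group by d.StatusDescription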
Hope this helps,
Asa Monsey
quote:

>--Original Message--
>Hi Kevin,
>Thanks for answering me back. I still have another
>question for you.
>It's not the first time that I've developed a data warehouse. My
>only worry here is the fact that these tables are related
>together in a many-to-many relationship in the OLTP system.
>So if I do it like you suggested,
>(IncidentNumber, DateId, StatusId, PriorityId, RankId,
>IncidentCount)
>I could end up with something like this for each incident in my
>FACT table:
>1, 20031208, 1, 0, 0, 1
>1, 20031208, 0, 1, 1, 1
>1, 20031208, 4, 0, 0, 1
>1, 20031208, 5, 0, 0, 1
>Because there is a possibility that the StatusId, PriorityId,
>and RankId change many times during the life
>cycle of an incident.
>So what I thought to do is create a separate fact table for
>each: Fact_IncidentStatus, Fact_IncidentPriority,
>Fact_IncidentRank. Each of these fact tables is related to
>its own dimension, and then all those fact tables are
>related to FACT_Incident, which contains only unique
>incident numbers.
>Do you think this is a good design?
>Thanks for your time
>.
>

Facing a problem exporting a Crystal Report to PDF which contains some items' images

I am facing a problem exporting a Crystal Report to PDF when it contains some items' images.

All images change to the first item's image. The application is developed in Visual Basic 6.0 and MS Access.

The images come into the Crystal Report from a physical location, not from any database, and I am using the default export utility of Crystal Reports.

See if you find a solution here:
support.BusinessObjects.com

Sunday, February 19, 2012

Extracting image column to the file

Hello all,
how can I extract an image stored in the DB into a file?
Will something like this work?
DECLARE @cmd VARCHAR(2048)
SET @cmd = 'bcp "select history_image from recipient where recipient_id = 1" queryout c:\test\625954.jpg'
EXEC master..xp_cmdshell @cmd, NO_OUTPUT

You want to use TextCopy.
e.g.
--OJ: TEXTCOPY example
-- Loading txt files into db
--create tb to hold data
create table tmp(fname varchar(100),txt text default '')
go
declare @sql varchar(255),
@fname varchar(100),
@path varchar(50)
--specify desired folder (trailing backslash is needed for the dir mask and file path below)
set @path='c:\test\'
set @sql='dir ' + @path + '*.txt /c /b'
--insert filenames into tb
insert tmp(fname)
exec master..xp_cmdshell @sql
--loop through and insert file contents into tb
declare cc cursor
for select fname from tmp
open cc
fetch next from cc into @fname
while @@fetch_status=0
begin
set @sql='textcopy /s"dev" /d"tempdb" /t"tmp" /c"txt" /w"where fname=''' + @fname + '''"'
set @sql=@sql + ' /f"' + @path + @fname + '" /i'
print @sql
exec master..xp_cmdshell @sql,no_output
fetch next from cc into @fname
end
close cc
deallocate cc
go
select * from tmp
go
drop table tmp
go
-oj
"Gene" <Gene@.discussions.microsoft.com> wrote in message
news:DDDCCF4D-23B1-4348-9740-47A7738C9E36@.microsoft.com...
> Hello all,
> how can I extract image strored in DB in to file?
> Will something like this work?
> DECLARE @.cmd VARCHAR(2048)
> SET @.cmd = 'bcp select history_image from recipient where recipient_id = 1
> queryout c:\test\625954.jpg '
> EXEC master..xp_cmdshell @.cmd, NO_OUTPUT
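(Following up on the TextCopy suggestion, to answer the original extraction question: textcopy copies in both directions, and /o writes the column value out to a file, the mirror image of /i above. A hedged sketch - the server, login, and database names here are assumptions:)

textcopy /s"dev" /u"sa" /p"" /d"mydb" /t"recipient" /c"history_image" /w"where recipient_id = 1" /f"c:\test\625954.jpg" /o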