
Tuesday, March 27, 2012

DB2 driver/connection issue

Hi guys,

I have a kind of weird setup that may be causing this problem. I'm remote desktopping into a 64-bit box on a client's network. They installed SSIS there; I sign in as login 1, which has rights to the box but none of the databases, so I run SSIS as login 2, which can hit the databases.

I've got two data sources for this project: one is DB2 and the other is SQL Server 2005. I've been focusing on the DB2 side (I'm running the Microsoft OLE DB Provider for DB2 that was part of the feature pack from February 2007). I've registered the DLL with "regsvr32.exe /s /c ibmdadb2.dll" and tried to import database settings from a profile with "db2cfimp db2Settings.pdb". The connection to the server (call it DB2A) doesn't actually show up in "Admin Tools\ODBC Data Source Administrator", but seeing as how the connection sort of works, I don't think this is the problem.

Using the "IBM OLE DB Provider for DB2 Servers" connection I can connect to the DB2 database in SSIS's Data Flow with a DataReader Source. The correct columns are shown in the column mappings and everything looks good while designing.

Upon executing the package I get the following error:

SSIS package "Package.dtsx" starting.
Information: 0x4004300A at TGMV019, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0047062 at TGMV019, GM19 [31]: System.InvalidOperationException: The 'IBMDADB2.1' provider is not registered on the local machine.
at Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSConnectionManager90.AcquireConnection(Object pTransaction)
at Microsoft.SqlServer.Dts.Pipeline.DataReaderSourceAdapter.AcquireConnections(Object transaction)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostAcquireConnections(IDTSManagedComponentWrapper90 wrapper, Object transaction)
Error: 0xC0047017 at TGMV019, DTS.Pipeline: component "GM19" (31) failed validation and returned error code 0x80131509.
Error: 0xC004700C at TGMV019, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at TGMV019: There were errors during task validation.
SSIS package "Package.dtsx" finished: Failure.

The problem sure looks like it's with the IBMDADB2.1 provider. I have retried the regsvr32 command (is that a 32-bit-only registration?) to no avail. Does the registration require a reboot? (They installed this on a training database box that really can't be easily rebooted.)

Thanks for any ideas; I'm fresh out.

Jeff
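
For what it's worth, one way to check whether the provider registration actually took for 64-bit processes (a DLL registered only with the 32-bit regsvr32 is invisible to the 64-bit SSIS runtime, which would produce exactly this "not registered" error) is to enumerate the OLE DB providers from a query window on that box. The extended procedure below is undocumented but long-standing; treat this as a sketch, not a definitive diagnosis:

-- Lists the OLE DB providers visible to the SQL Server process on this box;
-- on a 64-bit instance this is a rough proxy for what 64-bit SSIS can load.
EXEC master.dbo.xp_enum_oledb_providers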
Since you are using the Microsoft OLE DB Provider for DB2, why aren't you using the OLE DB Source in the data flow instead of the DataReader Source?
|||I gave up on using "OLE DB Provider: Microsoft OLE DB Provider for DB2" because I can't seem to get past the error "Test connection failed because of an error in initializing provider. The parameter is incorrect." Also, these forums (I think) said the IBM provider was more reliable than the Microsoft one.

I'm not sure what to enter under Data Link Properties. I've tried various combinations for 'Initial catalog', 'Package collection' and 'Default schema', but none of them work. Also, pressing the "Packages" button at the bottom crashes Visual Studio, so I'm not sure what's supposed to go there.
|||See if this helps any:
http://www.ssistalk.com/db2_configurations.jpg

That is the list of all of the properties for one of my connections that I use on a daily basis to transfer records from DB2 to SQL Server 2005.

DB2 --> SQL Server 7 & 2000

I'm supporting a client that has an AS/400 (DB2 database) accounting system and has recently upgraded from SQL Server 7 to 2000. He runs all his queries through SQL Server to DB2 because the interface and reporting are much easier.

He has found that some of the same queries run through SQL Server 7 and then through SQL Server 2000 yield different row counts. A query run directly on the AS/400 or through SQL Server 7 returns the correct number of rows, while the same query run through SQL Server 2000 will return a count considerably less...

1,475,234 through SQL Server 7 and 2,045 through SQL Server 2000!

Does anyone know of any changes in the OPENQUERY structure between SQL 7 & 2000, or any changes in AS/400 interaction, that may account for this data loss in SQL Server 2000?
|||I vote for the miracle thing...

Where does the data reside?

What do you mean by "run on"?

How is connectivity established?

And I would use the word "perceived" in your statement...
|||Do you have any type of date criteria in your query?
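
To help localize where the rows disappear, a hedged sketch: compare the count evaluated remotely via OPENQUERY with the count pulled through the four-part name, which exercises the provider's row-fetch path differently. The linked server name AS400 and the library/table names below are assumptions for illustration only:

-- Count computed on the AS/400 itself:
SELECT COUNT(*) AS remote_count
FROM OPENQUERY(AS400, 'SELECT ORDERNO FROM ACCTLIB.ORDERS')

-- Count computed by pulling rows back through the provider:
SELECT COUNT(*) AS fourpart_count
FROM AS400..ACCTLIB.ORDERS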

Sunday, March 11, 2012

DB recovery

Since my client's log file is growing extremely large, I DETACH the database, delete the .LDF file, and ATTACH the database from the original file. This method worked in the past, but this time when I tried to ATTACH the database, it prompted 'ERROR: 1813, unable to create database' or something like that (I had to translate it, since it is a Chinese SQL Server). It seems the original .MDF file has become corrupted. All I have with me now is the original .MDF file; what can I do?
Tony
|||You shouldn't have deleted the log file in the first place. When SQL Server starts, it performs recovery for each database, where it reads through the log and synchronizes the modifications in the log with the database. Imagine what happens if the log isn't there!
In some situations, where there is no recovery to perform, SQL Server can happily create a new log file for you, but how would you know that this is what will happen when you delete the log file? You can't.
Deleting the log file is an extremely unsafe method to reclaim disk space.
I suggest that you restore from the latest clean backup. That is the only way to get a consistent database back. If that isn't an option, let MS help you through this situation, as they might have tools/commands to save what can be saved.
--
Tibor Karaszi, SQL Server MVP
Archive at: http://groups.google.com/groups?oi=djq&as_ugroup=microsoft.public.sqlserver
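
For reference, a minimal sketch of the safer space-reclaim route (SQL Server 2000 syntax) rather than deleting the .LDF; the database and logical log file names below are hypothetical:

-- Hypothetical names. Take a real log backup instead of TRUNCATE_ONLY if you
-- need to keep the log chain for point-in-time recovery.
USE ClientDB
BACKUP LOG ClientDB WITH TRUNCATE_ONLY
DBCC SHRINKFILE (ClientDB_log, 100)  -- shrink the log file to roughly 100 MB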
"Tony Lam" <anonymous@.discussions.microsoft.com> wrote in message
news:006b01c3ba1c$e534a8b0$a001280a@.phx.gbl...
> Since my client's log file is going extremely large, I
> DETACH the database,
> delete the .LDF file, and ATTACH the database from the
> original file. This
> method worked in the past, but this time when I tried to
> ATTACH the
> database, it prompts 'ERROR: 1813, unable to create
> database' something like
> that (I tried to translate it since it is a Chinese SQL
> server). It seems
> the original .MDF file has corrupted. Now, I have with me
> is the original
> ..MDF file, what can I do?
> Tony
>|||sorry to contridict you Tibor (I bow to your superior knowledge), but
detaching the database (in situations where there is only ONE logfile)
causes it to be shutdown cleanly meaning that the current logfile is
not needed for the reattach operation. I agree with you that deleting
it is perhaps best advised against until a successful reattach (simply
rename it).
Kalen mentions this tip in her Inside SQL 2K book (Chpt 5,Other
Database Considerations).
Let me know if the current thinking has changed on doing this (either
by MS or the SQL professionals).
Br,
Mark Broadbent
mcdba , mcse+i
=============
|||Mark,
> sorry to contridict you Tibor
No problem, one of the best way to learn things IMO...
> (I bow to your superior knowledge),
LOL... :-)
> but
> detaching the database (in situations where there is only ONE logfile)
> causes it to be shutdown cleanly meaning that the current logfile is
> not needed for the reattach operation.
Ahh, I didn't read the OP that close. IIRC, the doc's states that you can do this if you have only
one log file *and* only one data file. And then use sp_attach_single_file_db. If this is what Tony
did, then SQL Server didn't behave as per the documentation. I agree with that. :-)
Personally, I still don't feel comfortable doing this unless I have very good backup etc to fallback
on.
--
Tibor Karaszi, SQL Server MVP
Archive at: http://groups.google.com/groups?oi=djq&as_ugroup=microsoft.public.sqlserver
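
For reference, a minimal sketch of the one-data-file, one-log-file reattach Mark and Tibor are discussing; the database name and path are hypothetical, and the .LDF is renamed, not deleted, until the attach succeeds:

-- Hypothetical names and path.
EXEC sp_detach_db @dbname = 'ClientDB'
-- ...rename ClientDB_log.ldf at the OS level, then:
EXEC sp_attach_single_file_db
    @dbname = 'ClientDB',
    @physname = 'C:\MSSQL\Data\ClientDB.mdf'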
"Mark Broadbent" <no-spam-please_mark.broadbent@.virgin.net> wrote in message
news:Oo0LjdmuDHA.2304@.TK2MSFTNGP12.phx.gbl...
> sorry to contridict you Tibor (I bow to your superior knowledge), but
> detaching the database (in situations where there is only ONE logfile)
> causes it to be shutdown cleanly meaning that the current logfile is
> not needed for the reattach operation. I agree with you that deleting
> it is perhaps best advised against until a successful reattach (simply
> rename it).
> Kalen mentions this tip in her Inside SQL 2K book (Chpt 5,Other
> Database Considerations).
> Let me know if the current thinking has changed on doing this (either
> by MS or the SQL professionals).
>
> --
> Br,
> Mark Broadbent
> mcdba , mcse+i
> =============|||Full Backup always a good idea @.:-)
Amen to that!
--
Br,
Mark Broadbent
mcdba , mcse+i
=============

DB Process is dead or not enabled

Hi,
I am getting the following SQL Server message in my client application after processing a large volume of transactions.
Error Message
=========
10005,-1, DB Process is dead or not enabled
SQL Server Edition
============
MSDE with SP3
OS Edition
=======
Windows 2000 Professional
|||Hi
Any other errors in the SQL errorlog?
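
(MSDE ships with no graphical log viewer, so for anyone following along: the errorlog can be read from osql or any query tool with the undocumented but long-standing extended procedure below.)

-- Reads the current errorlog; pass 1, 2, ... to read the archived logs.
EXEC master..xp_readerrorlog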
"Balaji Prabhu.T" <anonymous@.discussions.microsoft.com> wrote in message
news:4972B017-04B1-451B-A8D2-D88DAF325863@.microsoft.com...
> Hi,
> I am getting the following SQL Server message in my client application
after processing volume of transaction.
> Error Message
> =========
> 10005,-1, DB Process is dead or not enabled
> SQL Server Edition
> ============
> MSDE with SP3
> OS Edition
> =======
> Windows 2000 Professional
>

Sunday, February 19, 2012

DB Logic - Can't put it back together

Hi there!

Hope somebody has some better insight into this problem than I have. I'm struggling with some DB logic.

Overview:

I have a client who is a publisher. They would like to post all their advert-size specs for each magazine to their web site. My first thought was that I'd be able to build one table and populate it with the information for each magazine. I was wrong. Each magazine comes in a different size, thus each magazine will have different advert sizes as well.

Layout for a magazine will look something like this:

Size       Trim      Type      Bleed
Full Page  280x440   270x430   290x450
Half Page  140x220   130x210   150x230
etc...

Some mags will not have values for half pages, since they don't print half pages, and others will not have specs for Bleed.

Because of this, as an easy way out, I created a table per magazine. It works, but I don't think it's very smart.

Break it down!

OK, so what I'm trying as a solution is to have three tables. The 1st table will hold the magID and Size values (e.g. Half Page). The 2nd table will also have the magID and the Trim, Type, Bleed info. The 3rd table holds magID, sizeID, specID and the actual value (140x220).

I thought that this would be better because within these three tables I can store the information for each magazine regardless of their differences. Brilliant!

The Problem.

The problem comes when I have to put it all back together again. I need to represent this data in a table so I can bind it to a datagrid. I have NO idea how to do this. What I THINK I need is some temp table created on the fly. The row names for this temp table will come from the 1st table, the column names will come from the 2nd table, and the values for each field will come from the 3rd table, bound by foreign keys.

I've somewhat managed to do this with INNER JOINs, but it doesn't give the desired result. I need to set row and column NAMES using tables 1 & 2, then populate the columns with table 3, then bind to a datagrid.

Any ideas on how I could manage this?

If you made it this far through my question, then thanks anyway! I hope you can help me out!

Ta

You'll need to use SQL Server 2005's new PIVOT function. Assuming your tables are set up as such:

MagSize: ID, Size
MagType: ID, TypeName
MagAttribute: SizeID, TypeID, Value
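
(For anyone wanting to try this, here is a hypothetical setup matching those three tables, seeded with the values from the layout in the question:)

-- Hypothetical DDL and sample rows; adjust types and sizes to taste.
CREATE TABLE MagSize (ID int PRIMARY KEY, Size varchar(50))
CREATE TABLE MagType (ID int PRIMARY KEY, TypeName varchar(50))
CREATE TABLE MagAttribute (SizeID int, TypeID int, Value varchar(20))

INSERT INTO MagSize VALUES (1, 'Full Page')
INSERT INTO MagSize VALUES (2, 'Half Page')
INSERT INTO MagType VALUES (1, 'Trim')
INSERT INTO MagType VALUES (2, 'Type')
INSERT INTO MagType VALUES (3, 'Bleed')
INSERT INTO MagAttribute VALUES (1, 1, '280x440')
INSERT INTO MagAttribute VALUES (1, 2, '270x430')
INSERT INTO MagAttribute VALUES (1, 3, '290x450')
INSERT INTO MagAttribute VALUES (2, 1, '140x220')
INSERT INTO MagAttribute VALUES (2, 2, '130x210')
INSERT INTO MagAttribute VALUES (2, 3, '150x230')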

You can generate your desired results with:

WITH TempTable AS (
    SELECT Size, TypeName, Value
    FROM MagSize s
    INNER JOIN MagAttribute v ON s.ID = v.SizeID
    INNER JOIN MagType t ON t.ID = v.TypeID
)
SELECT * FROM TempTable
PIVOT (
    MAX(Value)
    FOR TypeName IN ([Trim], [Type], [Bleed])
) AS PVT

MSDN Article
|||

Hi northside!

Thanks for your reply!

OK, seems like I'm gonna be stuck, since I only have access to a SQL 2000 server.

I was just wondering about your query, where you used the Trim, Type, Bleed values in the second-to-last line: I don't know if every magazine will have three attributes (some might only have Trim and Bleed). Is there some way I can 1) replace Trim, Type, Bleed with variables and 2) use some array, maybe, to generate the quantity of attributes?

See, I reckon I can do this if I can create DDL based on the results of the SELECT query. But my logic tells me I'll need to store the results in an array, and I don't know if T-SQL caters for that. So far I couldn't find anything.

Thanks once again!

|||

You can still do pivot queries in SQL Server 2000; it's just a bit more convoluted. The query below does the equivalent of the one above:

SELECT s.Size,
    MAX(CASE WHEN TypeName = 'Trim' THEN Value END) AS Trim,
    MAX(CASE WHEN TypeName = 'Type' THEN Value END) AS Type,
    MAX(CASE WHEN TypeName = 'Bleed' THEN Value END) AS Bleed
FROM MagSize s
INNER JOIN MagAttribute v ON s.ID = v.SizeID
INNER JOIN MagType t ON t.ID = v.TypeID
GROUP BY s.Size

Unfortunately, neither the above query nor SQL Server 2005's PIVOT function is dynamic - you must know in advance which columns you want to pivot. If you want a more dynamic report, you have to write an ad-hoc query. Something like the one below should hopefully get you started:

CREATE PROCEDURE [dbo].[MagazinePivot]
    @ID int
AS
BEGIN
    SET NOCOUNT ON;

    -- Build the MAX(CASE...) column list from the types actually used.
    DECLARE @pivot VARCHAR(2000)
    SELECT @pivot = COALESCE(@pivot + ', ', '')
        + 'MAX(CASE WHEN TypeName = ''' + TypeName + ''' THEN Value END) ' + TypeName
    FROM MagType
    WHERE ID IN (SELECT TypeID FROM MagAttribute WHERE SizeID = @ID)

    DECLARE @sql VARCHAR(2000)
    SET @sql = '
        SELECT s.Size, ' + @pivot + '
        FROM MagSize s
        INNER JOIN MagAttribute v ON s.ID = v.SizeID
        INNER JOIN MagType t ON t.ID = v.TypeID
        GROUP BY s.Size'

    EXEC (@sql)
END
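
A hypothetical call, assuming a MagSize row with ID = 1 exists:

EXEC dbo.MagazinePivot @ID = 1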

|||

northside!

Dude, quality post!

Gonna take me a while to work through this one though.

Thanks bud