Search Results

Search found 58965 results on 2359 pages for 'ssis data transformations'.

  • What does error 0xC02020C4 mean in SSIS?

    I get this error with the following description: Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

    Read the article

  • Modify the Event Log Source name for an SSIS package

    - by Paul Kohler
    I have an SSIS package that logs to the Event Log (yes, the Event Log!). The default "Source" of the log events is "SQLISPackage100", but I want it to be something like "AppName":

    Event Type: Error
    Event Source: SQLISPackage100
    Event Category: None
    ...
    Description: Package "Foo" failed.

    I hope the answer is simple, but does anyone know how to change the text of the Event Log Source?

    Read the article

  • SSIS package randomly hangs during execution

    - by Adam MacLeod
    Hi Guys, I am having an ongoing and painful problem with an SSIS package. The package runs every 5 minutes as a SQL Agent job, and every 2-10 days it will start running and never stop (thus preventing further executions). If I stop the hung job manually, it begins working perfectly again at the next 5-minute interval. The SSIS package moves data from an Oracle database to a MSSQL 2005 database. It has 7 steps:

    Step 1 calls an Oracle stored procedure to prepare the temporary tables inside Oracle.
    Steps 2-6 process the data from the Oracle tables to the MSSQL tables.
    Step 7 calls an Oracle stored procedure to clear the Oracle temporary tables.

    I suspect that the issue is caused by a communications error between the MSSQL server and the Oracle server. Both the MSSQL database and the Agent/package run on one machine, with the Oracle database reached over the network. I enabled logging for the package, and after more than 2GB of log file I captured the instant where the package stops responding:

    OnPreValidate,ADV-SRV5,NT AUTHORITY\SYSTEM,CallistaIntegrationToMonashCRM_delta,{F88F6C45-CFA2-4801-A2F2-DDF03D458A48},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,(null)
    OnPreValidate,ADV-SRV5,NT AUTHORITY\SYSTEM,Address,{c5907799-f918-43da-818a-d4bd7f188367},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,(null)
    OnInformation,ADV-SRV5,NT AUTHORITY\SYSTEM,Address,{c5907799-f918-43da-818a-d4bd7f188367},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,1074016266,0x,Validation phase is beginning.
    OnProgress,ADV-SRV5,NT AUTHORITY\SYSTEM,Address,{c5907799-f918-43da-818a-d4bd7f188367},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,Validating
    Diagnostic,ADV-SRV5,NT AUTHORITY\SYSTEM,Callista,{cb5d6fe3-3ea4-4453-8e5a-965818021df7},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,ExternalRequest_pre: The object is ready to make the following external request: 'IDataInitialize::GetDataSource'.
    Diagnostic,ADV-SRV5,NT AUTHORITY\SYSTEM,Callista,{cb5d6fe3-3ea4-4453-8e5a-965818021df7},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,ExternalRequest_post: 'IDataInitialize::GetDataSource succeeded'. The external request has completed.
    Diagnostic,ADV-SRV5,NT AUTHORITY\SYSTEM,Callista,{cb5d6fe3-3ea4-4453-8e5a-965818021df7},{3A1FB1E3-B76D-444D-876B-D1FBBB9BA246},6/06/2010 10:15:01 AM,6/06/2010 10:15:01 AM,0,0x,ExternalRequest_pre: The object is ready to make the following external request: 'IDBInitialize::Initialize'.

    These messages are the entire log generated for the failed run; for a successful run the output is typically ~2500 lines. I can see that the package is hanging during the initialize operation on the Callista connection (the Oracle database). I have not been able to work out a way to either fix this issue or have the package die gracefully (an error in the log would be A-OK with me). Any help or advice would be greatly appreciated.
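
    A common stopgap while the root cause is chased down is a watchdog job that flags (or stops) any instance of the job running far past its normal duration. A minimal T-SQL sketch against the standard msdb tables; the job name and the 30-minute threshold are assumptions for illustration:

    ```sql
    -- Look only at the current Agent session, and find job instances that
    -- started more than 30 minutes ago and have no recorded stop time.
    SELECT j.name, a.start_execution_date
    FROM msdb.dbo.sysjobactivity AS a
    JOIN msdb.dbo.sysjobs AS j
        ON j.job_id = a.job_id
    WHERE a.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions)
      AND a.start_execution_date IS NOT NULL
      AND a.stop_execution_date IS NULL
      AND a.start_execution_date < DATEADD(MINUTE, -30, GETDATE())
      AND j.name = N'Oracle_to_MSSQL_5min';  -- assumed job name

    -- A follow-up step could call msdb.dbo.sp_stop_job @job_name = N'...'
    -- for any row returned, which at least restores the 5-minute cadence.
    ```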

    Read the article

  • SSIS Data Flow Task SQL 2008

    - by Gerard
    Hi All, I am wondering if it is possible to develop an SSIS package whose Data Flow task uploads to a "remote" SQL Server, i.e. one that is not on site or on the LAN. I am aware of how to do this with a local or network SQL Server. Any guidance would be great. Thanks

    Read the article

  • SSIS flat file insertion failure to rollback

    - by Pramodtech
    I have a simple SSIS package which reads data from a flat file and inserts it into a SQL database. The file has 90K rows, and sometimes the package fails because of bad data, but it has already inserted some of the records by the time it fails. What I need is: if insertion fails at any point, no records should be inserted into the DB; everything should roll back. How can I put it in a transaction?
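
    Two common approaches: set TransactionOption to Required on the Data Flow task (which enlists the inserts in a distributed transaction via MSDTC), or load into a staging table and move the rows in one atomic statement so the real table is never partially loaded. A minimal T-SQL sketch of the staging route, with table and column names assumed:

    ```sql
    -- Runs in an Execute SQL Task after the data flow has loaded dbo.StagingOrders.
    -- If the data flow failed, this step never runs, so dbo.Orders stays untouched;
    -- a single INSERT...SELECT is itself atomic, so it succeeds or inserts nothing.
    INSERT INTO dbo.Orders (OrderID, CustomerID, Amount)   -- assumed target table
    SELECT OrderID, CustomerID, Amount
    FROM dbo.StagingOrders;

    TRUNCATE TABLE dbo.StagingOrders;  -- reset staging for the next file
    ```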

    Read the article

  • Error importing SSIS package with Konesans File system Watcher task into SQL Server 2008

    - by Craig HB
    I am importing SSIS packages to SQL Server 2008 that were originally built for SQL Server 2005. I upgraded them in VS2008 and then imported them. They all import and work except for the one with the Konesans File System Watcher task. I ran the setup exe for the Konesans File System Watcher task for SQL Server 2008 on my dev PC and on the production server, but I still get this error: Exception from HRESULT: 0xC0010026 (Microsoft.SqlServer.DTSRuntimeWrap). Any advice?

    Read the article

  • SSIS global variable

    - by Pramodtech
    Is there anything similar to a global variable in SSIS? I have 4 variables (FromAddress, ToAddress, ...) which will be used in all 32 packages. If I could set them only once, they would be easy to use in every package and it would save me time. Please advise.
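
    There is no true global variable across packages, but package configurations get the same effect: point all 32 packages at one shared configuration source (an XML file, an environment variable, or a SQL Server configuration table) and maintain the values in one place. A sketch of the SQL Server configuration-table route; the table layout matches what the SSIS "SQL Server" configuration type generates, while the values and paths below are illustrative:

    ```sql
    -- Shared configuration table referenced by every package's
    -- "SQL Server" package configuration.
    CREATE TABLE dbo.[SSIS Configurations] (
        ConfigurationFilter NVARCHAR(255) NOT NULL,
        ConfiguredValue     NVARCHAR(255) NULL,
        PackagePath         NVARCHAR(255) NOT NULL,
        ConfiguredValueType NVARCHAR(20)  NOT NULL
    );

    INSERT INTO dbo.[SSIS Configurations]
    VALUES (N'MailSettings', N'noreply@example.com',
            N'\Package.Variables[User::FromAddress].Properties[Value]', N'String');

    INSERT INTO dbo.[SSIS Configurations]
    VALUES (N'MailSettings', N'ops@example.com',
            N'\Package.Variables[User::ToAddress].Properties[Value]', N'String');
    ```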

    Read the article

  • SSIS - 'Execute SQL' Task and Record Sets

    - by Mick Walker
    Hi, How can I access a RecordSet within an Execute SQL Task when using SSIS? I have looked at the parameter mapping options within the Execute SQL Task Editor and cannot find an object type that would allow me to pass the variable holding my record set to the task.
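
    For what it's worth, an Object variable cannot be mapped as an input parameter of an Execute SQL Task; record sets normally flow the other way. The task produces one: set ResultSet to "Full result set" and, on the Result Set page, map result name 0 to a variable of type Object, which a Foreach Loop (ADO enumerator) or a Script Task then consumes. A sketch of the query side, with names assumed:

    ```sql
    -- Query inside the Execute SQL Task. In the editor:
    --   General    -> ResultSet = Full result set
    --   Result Set -> Result Name 0 maps to User::rsOrders (Data Type: Object)
    SELECT OrderID, CustomerID
    FROM dbo.Orders            -- assumed table
    WHERE ShippedDate IS NULL; -- assumed filter
    ```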

    Read the article

  • Customizing Mail Message in SSIS Event Handler

    - by Eric Ness
    I want to add an email notification to an SSIS 2005 package event handler, so I've added a Send Mail task to the event handler. I'd like to customize the email body to include things like the error description. I've tried including @[System::ErrorDescription] in the MessageSource field, but the mail message contains only the name of the variable, not the value of ErrorDescription.
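
    The MessageSource field is being treated as a literal string. The usual fixes are to set MessageSourceType to Variable and point it at a string variable whose EvaluateAsExpression property is True, or to put a property expression on the task's MessageSource property, e.g. "Package failed: " + @[System::ErrorDescription]. As a plainly swapped-in alternative that bypasses the Send Mail Task, an Execute SQL Task can call Database Mail with the description mapped in as a parameter; a sketch, with the profile and recipients assumed:

    ```sql
    -- Execute SQL Task inside the OnError handler (OLE DB connection).
    -- Parameter Mapping: System::ErrorDescription -> parameter ordinal 0 (the ?).
    DECLARE @msg NVARCHAR(4000);
    SET @msg = ?;

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = N'DefaultProfile',   -- assumed Database Mail profile
        @recipients   = N'ops@example.com',  -- assumed
        @subject      = N'SSIS package failure',
        @body         = @msg;
    ```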

    Read the article

  • Tracking down data load performance issues in SSIS package

    - by SteveC
    Are there any ways to determine what differences between databases affect an SSIS package's load performance? I've got a package which loads and does various bits of processing on ~100k records in about 5 minutes against the database on my laptop. The same package with the same data on the test server, which is a reasonable box in both CPU and memory, is still running... about 1 hour so far :-( I checked the package with a small set of data, and it ran through OK.
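
    A useful first step on the slow server is to check what the load's session is actually waiting on while the package runs; network, lock, and I/O waits each point to a different class of fix. A hedged diagnostic sketch using the standard DMVs (SQL Server 2005 and later):

    ```sql
    -- While the package is running, list active requests, their waits,
    -- and the statement each one is executing.
    SELECT r.session_id,
           r.status,
           r.wait_type,   -- e.g. ASYNC_NETWORK_IO, LCK_M_*, PAGEIOLATCH_*
           r.wait_time,
           t.text AS current_sql
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.session_id > 50;   -- skip system sessions
    ```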

    Read the article

  • OnTaskFailed event handler in SSIS

    - by Jason M
    If I use the OnError event handler in my SSIS package, the variables System::ErrorCode and System::ErrorDescription give me the error information when anything fails during execution. But I can't find the equivalent for the OnTaskFailed event handler. How do I get the ErrorCode and ErrorDescription from within the OnTaskFailed event handler, in the case where we want to implement only an OnTaskFailed event handler for our package?

    Read the article

  • SSIS Null Value Questions

    - by Saobi
    I have a table with 5 string columns, all of which can be NULL. After I read the data from this table, I want to convert any NULL values into empty strings. The reason is that I need to compare these columns with columns in another table of the same schema (using a Conditional Split), and NULL values would cause the comparison to evaluate to NULL. Is there any functionality in SSIS that allows me to convert NULLs to empty strings, or to avoid dealing with NULLs at all?
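
    Two common options: strip the NULLs in the source query itself, or add a Derived Column transformation that replaces each column with an expression such as ISNULL(Col1) ? "" : Col1. The source-query version as a T-SQL sketch, with column and table names assumed:

    ```sql
    -- NULLs never enter the data flow, so the Conditional Split
    -- compares plain (possibly empty) strings on both sides.
    SELECT ISNULL(Col1, '') AS Col1,
           ISNULL(Col2, '') AS Col2,
           ISNULL(Col3, '') AS Col3,
           ISNULL(Col4, '') AS Col4,
           ISNULL(Col5, '') AS Col5
    FROM dbo.SourceTable;   -- assumed table
    ```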

    Read the article

  • SSIS Permissions issue

    - by Dave
    Hi All, How can we set permissions so that users are allowed to download SSIS packages from the production server but are denied permission to run any package on the server? http://msdn.microsoft.com/en-us/library/ms141053(SQL.90).aspx If I assign users to any of the DB roles db_dtsadmin, db_dtsltduser, or db_dtsoperator, they automatically have permission to run packages. Appreciate your inputs. Thanks!
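
    The built-in msdb roles bundle viewing and running together, so a download-only profile generally means skipping them. One possibility, under the assumption that packages are stored in msdb and users export them with a query rather than the Management Studio UI (the store table is sysdtspackages90 on SQL Server 2005, sysssispackages on 2008):

    ```sql
    -- Read-only access to the package store; no SSIS role membership,
    -- so no execute rights come along with it.
    USE msdb;
    GRANT SELECT ON dbo.sysdtspackages90 TO [DOMAIN\PackageReaders];  -- assumed group

    -- Package XML can then be pulled out for download:
    SELECT name,
           CAST(CAST(packagedata AS VARBINARY(MAX)) AS XML) AS package_xml
    FROM dbo.sysdtspackages90
    WHERE name = N'SomePackage';  -- assumed package name
    ```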

    Read the article

  • Best tutorial to learn SSIS

    - by Nabin
    Hi, Which book is best for learning SSIS? In my project we need to take input from a CSV file and, after processing the data in SQL Server 2008, export it back to an Excel file. ASP.NET is used as the UI for this. Thanks, Nabin

    Read the article

  • Recommend ONE favorite SSIS component that does SFTP/FTPS

    - by Kevin Fairchild
    Sometimes normal FTP doesn't quite cut it... When you need to do secure FTP via SSIS packages, what ONE product would you recommend? Before answering, please see if someone has already suggested the same thing and, if so, vote it up. NOTE: Ideally, it needs to handle both SSH (SFTP) and SSL (FTPS) connections, but I'd consider two separate components if that makes the most sense.

    Read the article

  • Why doesn't SSIS ftp task receive file?

    - by Mark
    I'm running an FTP task inside of SSIS to receive a file. The task executes successfully, yet no file is returned to the local folder that I specified. Where did the file go? How can I make the FTP task download a file to the location where I need it?

    Read the article

  • SSIS to copy data from one table to another, where not in destination table

    - by alex
    I'm in the process of creating an SSIS package on a server (server1) that looks at the data in a SQL db on another site (server2) and copies the relevant rows across. The SQL statement required is:

    SELECT * FROM server2.ordersTable
    WHERE OrderID NOT IN (SELECT OrderID FROM server1.ordersTable)

    This selects the rows on server2 which aren't in the table on server1 (based on OrderID). I then need to insert the result into a table on server1. How would I approach this? What components do I need, etc.?
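
    One approach, assuming a linked server is defined on server1 pointing at server2, is a single Execute SQL Task running an INSERT...SELECT; the four-part name, database, and columns below are assumptions. A Data Flow alternative is an OLE DB Source on server2 feeding a Lookup against server1's table, with the no-match output going to the destination.

    ```sql
    -- Runs on server1: pull across only the orders that are missing locally.
    INSERT INTO dbo.ordersTable (OrderID, CustomerID, OrderDate)    -- assumed columns
    SELECT s.OrderID, s.CustomerID, s.OrderDate
    FROM [server2].[SalesDB].[dbo].[ordersTable] AS s               -- assumed linked-server path
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.ordersTable AS d
                      WHERE d.OrderID = s.OrderID);
    ```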

    Read the article

  • New Feature in ODI 11.1.1.6: ODI for Big Data

    - by Julien Testut
    By Ananth Tirupattur

    Starting with Oracle Data Integrator 11.1.1.6.0, ODI offers a solution for processing Big Data. This post provides an overview of that feature. With all the buzz around Big Data, and before getting into the details of ODI for Big Data, I will give a brief introduction to Big Data and the Oracle solution for Big Data.

    So, what is Big Data? Big Data includes:

    - structured data (data from relational data stores and XML data stores)
    - semi-structured data (such as data from weblogs)
    - unstructured data (such as text blobs and images)

    Traditionally, business decisions are based on the information gathered from transactional data. For example, transactional data from CRM applications is fed to a decision system for analysis and decision making. Products such as ODI play a key role in enabling decision systems. However, with the emergence of massive amounts of semi-structured and unstructured data, it is important for decision systems to include them in the analysis to achieve better decision-making capability. While there is an abundance of opportunities for businesses to gain competitive advantages, processing Big Data has its challenges:

    - the volume of data
    - the velocity of data (the high rate at which data is generated)
    - the variety of data

    To address these challenges and convert them into opportunities, we need an appropriate framework, a platform, and the right set of tools. Hadoop is an open-source framework, highly scalable and fault tolerant, for storing and processing large amounts of data. Hadoop provides two key services: distributed and reliable storage, called the Hadoop Distributed File System (HDFS), and a framework for parallel data processing called Map-Reduce. Innovations in Hadoop and its related technology continue to evolve rapidly, so it is highly recommended to follow information on the web to keep up with the latest developments.

    Oracle's vision is to provide a comprehensive solution to address the challenges faced by Big Data, supplying the necessary hardware, software, and tools for processing it. The Oracle solution includes:

    - Big Data Appliance
    - Oracle NoSQL Database
    - Cloudera distribution for Hadoop
    - Oracle R Enterprise (R is a statistical package which is very popular among data scientists)
    - ODI solution for Big Data
    - Oracle Loader for Hadoop, for loading data from Hadoop to Oracle

    Further details can be found here: http://www.oracle.com/us/products/database/big-data-appliance/overview/index.html

    ODI solution for Big Data: ODI's goal is to minimize the need to understand the complexity of the Hadoop framework and to simplify the adoption of Big Data processing in an enterprise. ODI provides the capabilities for an integrated architecture for processing Big Data: loading data into Hadoop, processing data in Hadoop, and loading data from Hadoop into Oracle. ODI is expanding its support for Big Data by providing the following out-of-the-box Knowledge Modules (KMs):

    - IKM File to Hive (LOAD DATA): load unstructured data from a file (local file system or HDFS) into Hive
    - IKM Hive Control Append: transform and validate structured data on Hive
    - IKM Hive Transform: transform unstructured data on Hive
    - IKM File/Hive to Oracle (OLH): load processed data in Hive to Oracle
    - RKM Hive: reverse-engineer Hive tables to generate models

    Using the loading KM you can map files (local and HDFS) to the corresponding Hive tables. For example, you can map weblog files categorized by date into a correspondingly partitioned Hive table schema.

    Using the Hive Control Append KM you can validate and transform data in Hive. In the example below, two source Hive tables are joined and mapped to a target Hive table.

    The Hive Transform KM facilitates processing of semi-structured data in Hive. In the example below, the data from a weblog is processed using a Perl script and mapped to a target Hive table.

    Using the Oracle Loader for Hadoop (OLH) KM you can load data from a Hive table or HDFS to a corresponding table in Oracle. OLH is available as a standalone product; ODI greatly enhances it by generating the configuration and mapping files for OLH based on the configuration provided in the interface and the KM options, and ODI seamlessly invokes OLH when executing the scenario. In the example below, an HDFS file is mapped to a table in Oracle.

    Development and deployment: the following diagram illustrates the development and deployment of the ODI solution for Big Data. Using ODI Studio on your development machine, create and develop the ODI solution for processing Big Data by connecting to a MySQL DB or Oracle database on a BDA machine or Hadoop cluster. Schedule the ODI scenarios to be executed by the ODI agent deployed on the BDA machine or Hadoop cluster.

    The ODI solution for Big Data provides several exciting new capabilities to facilitate the adoption of Big Data in an enterprise. You can find more information about the Oracle Big Data connectors on OTN.

    You can find an overview of all the new features introduced in ODI 11.1.1.6 in the following document: ODI 11.1.1.6 New Features Overview

    Read the article

  • Data recovery on a data HDD (no OS)

    - by aCuria
    I am helping a family member with a dead hard disk. It is a Seagate 200GB 3.5" HDD in one of those old-school external enclosures. The problem was that Windows failed to detect the hard disk when it was plugged in through USB. I removed the hard disk from its enclosure and plugged it into my desktop PC. The BIOS detects it on POST, but unfortunately Windows 7 refuses to boot: it gets stuck on the loading screen with the glowing Windows logo. Safe mode doesn't help either. What options do I have before going for professional data recovery?

    Edit: someone modified the title to something completely different from what I was asking; I just changed it back. To restate:

    1) There are two HDDs: disk A (dead) and disk B (my OS disk).
    2) When B alone is connected to my system, everything works fine.
    3) When A and B are both connected, the machine POSTs fine, but Windows won't load.
    4) A has NO OS; it is pure data. It came from an external HDD enclosure which doesn't belong to me, and I'm trying to do data recovery.

    Read the article

  • Design of input files reading when it comes to defaults/transformations

    - by Stefano Borini
    Suppose you have an application that reads an input file, in a language that does not support the concept of None. The input is read, parsed, and the contents are stored in a structure for later use. Now, in general you want to account for transformations of the data from the input, such as adding default values when none are specified, or expanding relative paths in the input to full paths. There are two different strategies to achieve this. The first strategy is to perform these transformations at input-file reading time. In practice, you put all the intelligence into the input parser, and your application has no logic to deal with unexpected circumstances, such as an unspecified value. You lose the information of what was specified and what wasn't, but you gain by black-boxing the details. Your "running code" needs that information in a proper form in any case, and is not concerned with whether it is a default or user-specified. The second strategy is to make the file reader a literal one-to-one mapper from the file to a memory-stored object, with no intelligent behavior. Unspecified values are not filled in (which may, however, be a problem in languages not supporting None) and data is stored verbatim from the file. The intelligence for recovery must now go into the "running code", which must check what was specified in the file, eventually falling back to a default or modifying the input properly before using it. I would like to know your opinion on these two approaches, and in particular which one you have found the most frequently implemented.

    Read the article

  • Validate data before uploading through SSIS

    - by The King
    I have an SSIS package to upload data from an Excel file into a SQL Server 2005 table. The Excel file will have anywhere from 20k to 30k lines of data. The upload works fine when all the data is correct, but it fails when there is even a small problem in a single row: mandatory values left null, inconvertible values (data type mismatches), and so on. I want to validate the Excel file before the upload and tell the user which row and column has the error. Any idea how to accomplish this without consuming much time or resources? Thanks
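
    A common pattern is to land the whole sheet in an all-VARCHAR staging table first, so no row is rejected during the load, then run set-based validation queries that report every offending row and column before the real insert. A T-SQL sketch with the staging layout and rules assumed; note that ISNUMERIC and ISDATE are only coarse checks on SQL Server 2005 (TRY_CONVERT arrives in 2012):

    ```sql
    -- Every column lands as text; RowNum preserves the spreadsheet line
    -- so errors can be reported back to the user by row and column.
    CREATE TABLE dbo.StagingUpload (
        RowNum  INT IDENTITY(1,1),
        Name    VARCHAR(255),   -- columns and rules below are assumed
        Amount  VARCHAR(50),
        DueDate VARCHAR(50)
    );

    SELECT RowNum, 'Name' AS ErrorColumn, 'mandatory value missing' AS ErrorRule
    FROM dbo.StagingUpload
    WHERE Name IS NULL OR Name = ''
    UNION ALL
    SELECT RowNum, 'Amount', 'not numeric'
    FROM dbo.StagingUpload
    WHERE ISNUMERIC(Amount) = 0
    UNION ALL
    SELECT RowNum, 'DueDate', 'not a valid date'
    FROM dbo.StagingUpload
    WHERE ISDATE(DueDate) = 0;
    ```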

    Read the article
