Search Results

Search found 9519 results on 381 pages for 'bulk import'.


  • What's the simplest way to get .flac music files into iTunes?

    - by Chris Adams
    I'm looking for a simple way to import .flac files into iTunes, so I can play them on my Mac and, when I'm out, on my iPhone, and I'm willing to bet I'm not the first person to want to do this. What are the best tools for doing this? Keeping the sound quality would be nice, but truth be told, the speakers I'd play them on are so crappy that a lot of the quality in the .flac format would be lost anyway, so I'm not averse to converting to mp3 files. If it helps give any context, I'm using a MacBook with OS X 10.5 Leopard and iTunes 9, connected to a 16 GB iPhone with standard Apple headphones, which is sometimes plugged into a Bose SoundDock for music, and the music files are piano performances.

    Read the article

  • How to transfer emails from Win XP to Vista or Windows 7?

    - by Suma
    I was using a Windows XP computer for a long time, with Outlook Express as my email client. I have lots and lots of emails I would like to keep, so I want to transfer them to my new computer with Windows 7. I have transferred all my settings using the Files and Settings Transfer Wizard; however, in Windows 7 there is a new mail client called Windows Mail, and I cannot see any option in it to import the Outlook Express mail. How can I transfer my emails from a Windows XP computer to a Vista or Windows 7 OS?

    Read the article

  • Any way to overwrite (not merge) Outlook contacts when importing from a file?

    - by Dan
    I'm trying to create a contact list for Outlook 2010 that will contain contact information for every person in my company. I intend on keeping the list current, which means I will be manually adding new employees to the contact list, and removing contacts who no longer work here. The contact list will reside in its own subfolder within the Outlook Contacts folder. I want to periodically export this contact list as a .csv file, and allow the other employees in the company to import it into Outlook on their own computer, thus providing them with a comprehensive and up-to-date company contact list. The problem is, Outlook 2010 only wants to merge contact lists, not overwrite them. This means that any contacts who are no longer with the company will not be removed from the contact lists on employee stations. Is there any way to force Outlook 2010 to overwrite the contact list? Oh how I long for the days of Outlook 2003 and its tidy .pab files.

    Read the article

  • Create an XML file to be used in a WordPress post import

    - by adedoy
    Alright here is the thing, I have this site that was once wordpress but have been converted into 70+ static pages, the admin is deleted and the whole site is static(which means every page is in index.html), I want to create a script that makes an xml so that I will just have to import it in the new wordpress install. So far, I am able to create an XML but it only imports one post. The data source is the URL of a page and I use jquery $get to filter only to gather the post of a given archive. //html <input type="text" class="full_path"> <input type="button" value="Get Data" class="getdata"> //script $('.getdata').click(function(){ $.get($('.full_path').val(), function(data) { post = $(data).find('div [style*="width:530px;"]'); $('.result').html(post.html()); }); });//get Data Through AJAX I send the cleaned data into a php below that creates the XML: $file = 'newpost.xml'; $post_data = $_REQUEST['post_data']; // Open the file to get existing content $current = file_get_contents($file); // Append a new post to the file $catStr = ''; if(isset($post_data['categories']) && count($post_data['categories']) > 0){ foreach($post_data['categories'] as $category) { $catStr .= '<category domain="category" nicename="'.$category.'"><![CDATA['.$category.']]></category>'; } } $tagStr = ''; if(isset($post_data['tags']) && count($post_data['tags']) > 0){ foreach($post_data['tags'] as $tag) { $tagStr = '<category domain="post_tag" nicename="'.$tag.'"><![CDATA['.$tag.']]></category>'; } } $post_name = str_replace(' ','-',$post_data["title"]); $post_name = str_replace(array('"','/',':','.',',','[',']','“','”'),'',strtolower($post_name)); $post_date = '2011-4-0'.rand(1, 29).''.rand(1, 12).':'.rand(1, 59).':'.rand(1, 59); $pubTime = rand(1, 12).':'.rand(1, 59).':'.rand(1, 59).' +0000'; $post = ' <item> <title>'.$post_data["title"].'</title> <link>'.$post_data["link"].'</link> <pubDate>'.$post_data["date"].' '.$pubTime.'</pubDate> <dc:creator>admin</dc:creator> <guid isPermaLink="false">http://localhost/saunders/?p=1</guid> <description></description> <content:encoded><![CDATA['.$post_data["content"].']]></content:encoded> <excerpt:encoded><![CDATA[]]></excerpt:encoded> <wp:post_id>1</wp:post_id> <wp:post_date>'.$post_date.'</wp:post_date> <wp:post_date_gmt>'.$post_date.'</wp:post_date_gmt> <wp:comment_status>open</wp:comment_status> <wp:ping_status>open</wp:ping_status> <wp:post_name>'.$post_name.'</wp:post_name> <wp:status>publish</wp:status> <wp:post_parent>0</wp:post_parent> <wp:menu_order>0</wp:menu_order> <wp:post_type>post</wp:post_type> <wp:post_password></wp:post_password> <wp:is_sticky>0</wp:is_sticky> '.$catStr.' '.$tagStr.' <wp:postmeta> <wp:meta_key>_edit_last</wp:meta_key> <wp:meta_value><![CDATA[1]]></wp:meta_value> </wp:postmeta> </item> '; // Write the contents back to the file with the appended post file_put_contents($file, $current.$post); After being appended I add the code below to complete the xml rss tag </channel> </rss> If I look and compare the xml file of one that is exported from a wordpress site, I see little difference. Please HELP!! 
here is a sample of a generated xml: <?xml version="1.0" encoding="UTF-8" ?> <rss version="2.0" xmlns:excerpt="http://wordpress.org/export/1.2/excerpt/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:wp="http://wordpress.org/export/1.2/" > <channel> <title>lols why</title> <link>http://localhost/lols</link> <description>Just another WordPress site</description> <pubDate>Wed, 03 Oct 2012 04:24:04 +0000</pubDate> <language>en-US</language> <wp:wxr_version>1.2</wp:wxr_version> <wp:base_site_url>http://localhost/lols</wp:base_site_url> <wp:base_blog_url>http://localhost/lols</wp:base_blog_url> <wp:author><wp:author_id>1</wp:author_id><wp:author_login>adedoy</wp:author_login><wp:author_email>[email protected]</wp:author_email><wp:author_display_name><![CDATA[adedoy]]></wp:author_display_name><wp:author_first_name><![CDATA[]]></wp:author_first_name><wp:author_last_name><![CDATA[]]></wp:author_last_name></wp:author> <generator>http://wordpress.org/?v=3.4.1</generator> <item> <title>Sample lift?</title> <link>../../breast-lift/delaware-breast-surgery-do-i-need-a-breast-lift/</link> <pubDate>Wed, 03 Oct 2012 9:29:16 +0000</pubDate> <dc:creator>admin</dc:creator> <guid isPermaLink="false">http://localhost/lols/?p=1</guid> <description></description> <content:encoded><![CDATA[<p>sample</p>]]></content:encoded> <excerpt:encoded><![CDATA[]]></excerpt:encoded> <wp:post_id>1</wp:post_id> <wp:post_date>2011-4-0132:45:4</wp:post_date> <wp:post_date_gmt>2011-4-0132:45:4</wp:post_date_gmt> <wp:comment_status>open</wp:comment_status> <wp:ping_status>open</wp:ping_status> <wp:post_name>sample-lift?</wp:post_name> <wp:status>publish</wp:status> <wp:post_parent>0</wp:post_parent> <wp:menu_order>0</wp:menu_order> <wp:post_type>post</wp:post_type> <wp:post_password></wp:post_password> <wp:is_sticky>0</wp:is_sticky> <category domain="category" nicename="Sample Lift"><![CDATA[Sample Lift]]></category><category domain="category" nicename="Sample Procedures"><![CDATA[Yeah Procedures]]></category> <category domain="post_tag" nicename="delaware"><![CDATA[delaware]]></category> <wp:postmeta> <wp:meta_key>_edit_last</wp:meta_key> <wp:meta_value><![CDATA[1]]></wp:meta_value> </wp:postmeta> </item> <item> <title>lalalalalala</title> <link>../../administrative-tips-for-surgery/delaware-cosmetic-surgery-a-better-experience/</link> <pubDate>Wed, 03 Oct 2012 3:20:43 +0000</pubDate> <dc:creator>admin</dc:creator> <guid isPermaLink="false">http://localhost/lols/?p=1</guid> <description></description> <content:encoded><![CDATA[ lalalalalala ]]></content:encoded> <excerpt:encoded><![CDATA[]]></excerpt:encoded> <wp:post_id>1</wp:post_id> <wp:post_date>2011-4-0124:39:30</wp:post_date> <wp:post_date_gmt>2011-4-0124:39:30</wp:post_date_gmt> <wp:comment_status>open</wp:comment_status> <wp:ping_status>open</wp:ping_status> <wp:post_name>lalalalalala</wp:post_name> <wp:status>publish</wp:status> <wp:post_parent>0</wp:post_parent> <wp:menu_order>0</wp:menu_order> <wp:post_type>post</wp:post_type> <wp:post_password></wp:post_password> <wp:is_sticky>0</wp:is_sticky> <category domain="category" nicename="lalalalalala"><![CDATA[lalalalalala]]></category> <category domain="post_tag" nicename="oink"><![CDATA[oink]]></category> <wp:postmeta> <wp:meta_key>_edit_last</wp:meta_key> <wp:meta_value><![CDATA[1]]></wp:meta_value> </wp:postmeta> </item> </channel> </rss> Please tell me what am I missing....

    Read the article

  • PHP: Load variables from an external file

    - by Adrian M.
    Hello, how can I import a variable from an external file? What I want to do is have a configuration file in which I can write all my website settings, and then import those settings into every file, so I can set the website skin and things like that. How can I do this? Thanks!

    Read the article

  • Python: Problem Importing Function From Another Module

    - by Rafid K. Abdullah
    I have a module called nbemail.py, and in this module I want to use the function package_post defined in the module main.py. I am using this statement: from api.main import package_post But I am getting this error: ImportError: cannot import name package_post I really don't know why I am getting this error! I do have __init__.py files in the api directory (which contains the files nbemail.py and main.py), and I do have the function package_post defined in main.py. Any ideas on how to fix this problem?
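
    Since the import statement itself looks right, a minimal diagnostic sketch like the one below (assuming the api package is importable from wherever you run it) can show which main.py Python is actually loading and whether package_post is defined on it: a stale main.py elsewhere on sys.path, or a circular import between main.py and nbemail.py, is the usual cause of this error.

        import importlib

        # Load the module the same way "from api.main import package_post" would,
        # then inspect what actually got imported.
        mod = importlib.import_module("api.main")
        print("api.main was loaded from:", mod.__file__)
        print("public names defined there:", [n for n in dir(mod) if not n.startswith("_")])
        # If package_post is missing from that list, either the wrong main.py is being
        # picked up, or a circular import left api.main only partially initialised.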

    Read the article

  • Looking for a Python IDE with good support for libraries (Twisted).

    - by Omega
    I'm looking for a Python IDE that can help me easily locate, manage, and use the libraries on my system (Ubuntu), specifically Twisted. Code completion is important, including for the symbols I import. (So far I've had a look at PyDev as well as Open Komodo, but while both offer code completion for built-in Python concepts, I wasn't able to get either to import Twisted into my project and was thus getting reference errors.) Usual disclaimer: I don't like Emacs or vi, so please, nothing regarding those.
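
    One small check that sometimes helps with this kind of IDE setup (a sketch, assuming Twisted is installed for the interpreter you normally run): print where the package actually lives, then point the IDE's interpreter/PYTHONPATH configuration at that same location so its code analysis can see it.

        # Show which interpreter and which Twisted installation are in use.
        import sys
        import twisted

        print(sys.executable)      # the interpreter the IDE should be configured with
        print(twisted.__file__)    # e.g. a path under .../dist-packages/twisted/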

    Read the article

  • How does QuickBooks handle IIF imports?

    - by dwwilson66
    I've received a 'template' for an IIF file for QuickBooks transactions, and there's like seventy-bazillion fields in there, lots of which I never even use. It's a tab-delimited file, with the following lines: field headers for transactions and the respective splits for those transactions, followed by an end-of-transaction marker. !TRNS FIELD1 FIELD2 FIELD3 ... FIELD48 !SPL FIELD1 FIELD2 FIELD3 ... FIELD48 !ENDTRNS TRNS FIELD1_DATA FIELD2_DATA FIELD3_DATA ... FIELD48_DATA SPL FIELD1_DATA FIELD2_DATA FIELD3_DATA ... FIELD48_DATA ENDTRNS ... What drives data to a particular field? Is it the field header with corresponding data, or is it the tabular position relative to the head of the line? E.g., let's say all I have to import is the data in FIELD1, FIELD3 and FIELD5: Would I import by header: !TRNS FIELD1 FIELD3 FIELD5 !SPL FIELD1 FIELD3 FIELD5 !ENDTRNS TRNS FIELD1 FIELD3 FIELD5 SPL FIELD1 FIELD3 FIELD5 ENDTRNS or by tabular position: !TRNS FIELD1 FIELD2 FIELD3 FIELD4 FIELD5 !SPL FIELD1 FIELD2 FIELD3 FIELD4 FIELD5 !ENDTRNS TRNS FIELD1_DATA FIELD2_BLANK FIELD3_DATA FIELD4_BLANK FIELD5_DATA SPL FIELD1_DATA FIELD2_BLANK FIELD3_DATA FIELD4_BLANK FIELD5_DATA ENDTRNS Alternatively, if it were a comma-delimited input, would I need: DATA1,DATA3,DATA5 or DATA1,,DATA3,,DATA5 Does anyone have any experience with what QuickBooks is doing?
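
    For what it's worth, here is a hedged sketch of the first interpretation (worth verifying against a throwaway QuickBooks company file before trusting it): as far as I can tell, the !TRNS/!SPL header lines declare which columns the data rows carry, and values are matched positionally against those declared headers, so you only need to list the fields you actually use.

        # Write a minimal tab-delimited IIF that declares only the columns it uses.
        # The field names below (TRNSTYPE, DATE, ACCNT, AMOUNT) are common IIF columns;
        # swap in whichever ones your template actually requires.
        import csv

        rows = [
            ["!TRNS", "TRNSTYPE", "DATE", "ACCNT", "AMOUNT"],
            ["!SPL", "TRNSTYPE", "DATE", "ACCNT", "AMOUNT"],
            ["!ENDTRNS"],
            ["TRNS", "CHECK", "05/19/2012", "Checking", "-100.00"],
            ["SPL", "CHECK", "05/19/2012", "Office Supplies", "100.00"],
            ["ENDTRNS"],
        ]

        with open("minimal.iif", "w", newline="") as handle:
            csv.writer(handle, delimiter="\t").writerows(rows)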

    Read the article

  • SharePoint AD imported users are becoming sporadically corrupted, causing us to have to create a new account

    - by TrevJen
    SharePoint 2007 (MOSS) with AD-imported users. All servers are 2008. I have around 50 users, and over the past two months a handful of them have suddenly become unable to log in to SharePoint. When they log in, they either get a blank screen or they are re-prompted. These users are using accounts that have been in service for many months; sometimes the problem originates with a password change. In all cases, the user's account works on every other Active Directory authenticated resource (domain, Exchange, LDAP). In the most recent case, last night I was forced to delete a user ("John Smith") because of corruption. The original account name was jsmith. I deleted him from Active Directory, then deleted him from the profile list in SharePoint Shared Services. I could not find a way to delete him from the SharePoint user list, but I reran the import after recreating his account (and renamed it to "smithj" just to be sure). At first this did not work; the user could still access every other resource but SharePoint. Then, some 30 minutes later, it inexplicably started working. This morning the user changed passwords, which immediately broke the login on SharePoint again. I am at a loss on how to troubleshoot this.

    Read the article

  • Importing long numerical identifiers into Excel

    - by Niels Basjes
    I have some data in a database that uses IDs in the form of 16-digit numbers. In some situations I need to export the data in such a way that it can be manipulated in Excel, so I export the data into a file and import it into Excel. I've tried several file formats and I'm stuck. The problem I'm facing is that when Excel reads a file and a cell looks like a number, Excel treats it as a number. The catch is that (as far as I can tell) all numerical values in Excel are double-precision floating point, which has a precision of less than 16 digits. So my IDs are changed: very often the last digit is changed to a 0. So far I've only been able to convince Excel to keep the ID unchanged by breaking it myself: by adding a letter or symbol to the ID. This, however, means that in order to use the value again it must be "unbroken". Is there a way to create a file where I can specify that Excel must treat the value as text without changing it? Or is there a way to let Excel treat the value as a long (64-bit integer)?
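
    One workaround I've seen (a hedged sketch, worth checking in the Excel version you actually target): write the CSV so each ID is wrapped as ="...", which Excel evaluates as a formula producing text rather than parsing into a float, so all 16 digits survive.

        # Write a CSV where each 16-digit id is emitted as ="...." so Excel keeps it as text.
        # The ids below are made-up examples, not real data.
        ids = ["1234567890123456", "9876543210987654"]

        with open("ids_for_excel.csv", "w") as handle:
            handle.write("id,name\n")
            for i, record_id in enumerate(ids, start=1):
                handle.write('="{0}",record {1}\n'.format(record_id, i))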

    Read the article

  • Bulk inserting: best way to go about it? + Helping me understand fully what I found so far

    - by chobo2
    Hi So I saw this post here and read it and it seems like bulk copy might be the way to go. http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials. http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx First way uses 2 ado.net 2.0 features. BulkInsert and BulkCopy. the second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However as one person pointed out in the posts what he just going around the issue at the cost of performance( nothing wrong with that in my opinion) First I will talk about the 2 ways in the first tutorial I am using VS2010 Express, .net 4.0, MVC 2.0, SQl Server 2005 Is ado.net 2.0 the most current version? Based on the technology I am using, is there some updates to what I am going to show that would improve it somehow? Is there any thing that these tutorial left out that I should know about? BulkInsert I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) ) SP Code USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END C# Code /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } So first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? Like I am sending 500,000 records so I did a Batch size of 500,000. Next why does it crash when I do this? If I set it to 1000 for batch size it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. 
(provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException: Time it took to insert 500,000 records with insert batch size of 1000 took "2 mins and 54 seconds" Of course this is no official time I sat there with a stop watch( I am sure there are better ways but was too lazy to look what they where) So I find that kinda slow compared to all my other ones(expect the linq to sql insert one) and I am not really sure why. Next I looked at bulkcopy /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } This one seemed to go really fast and did not even need a SP( can you use SP with bulk copy? If you can would it be better?) BatchCopy had no problem with a 500,000 batch size.So again why make it smaller then the number of records you want to send? I found that with BatchCopy and 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster then the bulkinsert one above. Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc C# code. /// <summary> /// This is using linq to sql to make the table objects. 
/// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } So I like this because I get to use objects even though it is kinda redundant. I don't get how the SP works. Like I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood but I do not even know how to take this example SP and change it to fit my tables since like I said I don't know what is going on. I also don't know what would happen if the object you have more tables in it. Like say I have a ProductName table what has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good for 500,000 records it took 52 seconds The last way of course was just using linq to do it all and it was pretty bad. /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records and that took over a minute to do. So I really narrowed it done to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationship for either way. I am not sure how they both stand up when doing updates instead of inserts as I have not gotten around to try it yet. I don't think I will ever need to insert/update more than 50,000 records at one type but at the same time I know I will have to do validation on records before inserting so that will slow it down and that sort of makes linq to sql nicer as your got objects especially if your first parsing data from a xml file before you insert into the database. Full C# code using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. 
/// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. 
adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }

    Read the article

  • ZXing project on Android

    - by Aisthesis Cronos
    Hello everybody. Many weeks ago I started working on a mini project on Android that requires ZXing. I followed several tutorials on this web site and elsewhere (for example tuto1, and many tags and tutorials here: tuto2, tuto3 ...), but I failed each time. I can't import the ZXing Android project into the Eclipse IDE and compile it together with my own code (that is, not going through an Intent to the ZXing Barcode Scanner APK, which is what my program currently does, as in this example): private Button.OnClickListener btScanListener = new Button.OnClickListener() { public void onClick(View v) { Intent intent = new Intent("com.google.zxing.client.android.SCAN"); intent.putExtra("SCAN_MODE", "QR_CODE_MODE"); try { startActivityForResult(intent, REQUEST_SCAN); } catch (ActivityNotFoundException e) { Toast.makeText(Main.this, "Barcode Scanner not installed", 2000).show(); } } }; public void onActivityResult(int reqCode, int resCode, Intent intent) { if (REQUEST_SCAN == reqCode) { if (RESULT_OK == resCode) { String contents = intent.getStringExtra("SCAN_RESULT"); Toast.makeText(this, "Success: " + contents, 2000).show(); } else if (RESULT_CANCELED == resCode) { Toast.makeText(this, "Scan cancelled", 2000).show(); } } } I feel disappointed, frustrated and sad. I still have errors after importing the project. I tried both ZXing versions 1.5 and 1.6. I tried to import the project C:\ZXing-1.6\android, and another new project from C:\ZXing-1.6\zxing-1.6\android; I also checked out http://zxing.googlecode.com/svn/trunk/ (zxing-read-only) with TortoiseSVN and reproduced the same steps, but unfortunately without results! I'm really fed up with myself... Please help me to solve this problem: how can I import the project and compile it correctly into my own project? 1 - I use Windows 7 64-bit Home Premium. 2 - Eclipse IDE for Java EE Web Developers, Version: Helios Service Release 2, Build id: 20110218-0911. What is an effective and sure method to get this running? Otherwise, if there is a video, a detailed guide, or someone who has already done this before, I would really appreciate the help.

    Read the article

  • Error building a CLR stored procedure project with MSBuild on the build server

    - by CraftyFella
    Hi, when building a CLR stored procedure project using MSBuild on our build server (TeamCity), we're getting the following error: error MSB4019: The imported project "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\SqlServer.targets" was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk. I've checked whether the file exists on disk, and sure enough it doesn't; I've checked on my own machine and it does exist. I don't really want to start copying files over to the build server manually. Here are the lines from the csproj file that are imported into the proj file: <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" /> <Import Project="$(MSBuildToolsPath)\SqlServer.targets" /> Here's the line from the proj file which is being run by our TeamCity server: <Import Project="..\$(ProjectName).csproj"/> My question is really: where does this file come from? Is it part of the Visual Studio install, for example? Or is there some redistributable package somewhere that would allow me to compile this project on our build server? Thanks. BTW, if I just copy the file onto the build server, it does actually work. Dave

    Read the article

  • Problem importing Oracle .dmp file

    - by BitFiddler
    So I have looked at all the suggested ways of importing .dmp files, and none of them seems to answer this question: where does the data go once you import it? Context: I created a user like so: SQL> create user IMPORTER identified by "12345"; SQL> grant connect, unlimited tablespace, resource to IMPORTER; I then ran the 'imp' command as follows: C:\>imp system/password FROMUSER=OVIEDOE TOUSER=IMPORTER file=c:\database1.dmp Now, there were 9 .dmp files; after each one it asked me for the next one, and then I received the message "Import terminated successfully with warnings." The warning was: Warning: the objects were exported by OVIEDOE, not by you import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set export client uses WE8ISO8859P1 character set (possible charset conversion) IMP-00046: using FILESIZE value from export file of 2147483648 Now, it says it terminated successfully, so my assumption (I am new to Oracle, so this may be wrong) is that the data was loaded. However, when I use SQL Developer to connect to the database and look under the 'tables' node under the IMPORTER user, there is nothing there. What is going on? Did the data load? If so, where can I find it?
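
    A quick way to see where the imported objects actually landed is to query the data dictionary for both schemas. The sketch below does that from Python (assuming the cx_Oracle driver and placeholder connection details, which are not part of the question); the same SELECT can of course be run straight from SQL Developer.

        # List tables owned by either the original schema or the remapped one.
        import cx_Oracle

        connection = cx_Oracle.connect("system", "password", "localhost/orcl")  # placeholder DSN
        cursor = connection.cursor()
        cursor.execute(
            "SELECT owner, table_name FROM all_tables "
            "WHERE owner IN ('OVIEDOE', 'IMPORTER') ORDER BY owner, table_name"
        )
        for owner, table_name in cursor:
            print(owner, table_name)
        cursor.close()
        connection.close()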

    Read the article

  • Scope of "library" methods

    - by JS
    Hello, I'm apparently laboring under a poor understanding of Python scoping. Perhaps you can help. Background: I'm using the if __name__ == "__main__" construct to perform "self-tests" in my module(s). Each self-test calls the various public methods and prints their results for visual checking as I develop the modules. To keep things "purdy" and manageable, I've created a small method to simplify the testing of method calls: def pprint_vars(var_in): print("%s = '%s'" % (var_in, eval(var_in))) Calling pprint_vars with: pprint_vars('some_variable_name') prints: some_variable_name = 'foo' All fine and good. Problem statement: Not happy to just KISS, I had the brain-drizzle to move my handy-dandy 'pprint_vars' method into a separate file named 'debug_tools.py' and simply import 'debug_tools' whenever I wanted access to 'pprint_vars'. Here's where things fall apart. I would expect import debug_tools foo = 'bar' debug_tools.pprint_vars('foo') to continue working its magic and print: foo = 'bar' Instead, it greets me with: NameError: name 'foo' is not defined Irrational belief: I believed (apparently mistakenly) that import puts imported methods (more or less) "inline" with the code, and thus that the variable scoping rules would remain the same as if the method were defined inline. Plea for help: Can someone please correct my (mis)understanding of scoping as it regards imports? Thanks, JS
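
    For what it's worth, a sketch of what is going on and one possible workaround (an illustration, not the only fix): eval() inside debug_tools evaluates names against debug_tools' own globals, where 'foo' does not exist; grabbing the caller's namespace explicitly with the inspect module restores the old behaviour.

        # debug_tools.py -- resolve names in the *caller's* namespace, not this module's.
        import inspect

        def pprint_vars(var_in):
            caller = inspect.currentframe().f_back       # the frame that called pprint_vars
            namespace = dict(caller.f_globals)
            namespace.update(caller.f_locals)            # locals shadow globals, as usual
            print("%s = '%s'" % (var_in, eval(var_in, namespace)))

        if __name__ == "__main__":
            foo = 'bar'
            pprint_vars('foo')                           # prints: foo = 'bar'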

    Read the article

  • Convert a C function to Objective-C, or alternative way to include a C library?

    - by Moshe
    Background: I'm new to Objective-C and I've never touched C before. I'm trying to include a C library in my iPhone/iPod Touch project and I could not successfully compile the library's source code using the included instructions, so I've resorted to including the .h and .c files in Xcode. The library is for Hebrew Dates and corresponding Jewish Holidays. It can be found here: http://sourceforge.net/projects/libhdate/ I cannot seem to import the .c files into my implementation files. I can import the .h files though. When I try to use #import "file.c", where file.c is a file that is in XCode, it doesn't work. Why not? I've considered just writing the functions over in Objective-C, albeit only my needed functions and not the whole library. How can I make the following C function work in Objective-C? Are there other things that need to be included/re-coded/compiled? How so? I am almost certain something is missing, but I'm not sure what. Code: int hdate_get_omer_day(hdate_struct const * h) { int omer_day; hdate_struct sixteen_nissan; hdate_set_hdate(&sixteen_nissan, 16, 7, h->hd_year); omer_day = h->hd_jd - sixteen_nissan.hd_jd + 1; if ((omer_day > 49) || (omer_day < 0)) omer_day = 0; return omer_day; } So... should I be converting it, or trying somehow to compile to an appropriate format and how so? (I don't know what format a static library should be in, by the way, or if it should be static... - I'm so lost here....) I appreciate the help!

    Read the article

  • Migrate from Oracle to MySQL

    - by Cassy
    Hi all. We ran into serious performance problems with our Oracle database, and we would like to try migrating to a MySQL-based database (either MySQL directly or, preferably, Infobright). The thing is, we need to let the old and the new systems overlap for at least some weeks, if not months, before we actually know whether all the features of the new database match our needs. So, here is our situation: The Oracle database consists of multiple tables, each with millions of rows. During the day there are literally thousands of statements, which we cannot stop for the migration. Every morning, new data is imported into the Oracle database, replacing some thousands of rows. Copying this process is not a problem, so we could, in theory, import into both databases in parallel. But, and here lies the challenge, for this to work we need an export from the Oracle database in a consistent state from a single day. (We cannot export some tables on Monday and some others on Tuesday, etc.) This means that, at the least, the export should finish in under one day. Our first thought was to dump the schema, but I wasn't able to find a tool to import an Oracle dump file into MySQL. Exporting tables to CSV files might work, but I'm afraid it could take too long. So my question now is: What should I do? Is there any tool to import Oracle dump files into MySQL? Does anybody have any experience with such a large-scale migration? Thanks in advance, Cassy PS: Please don't suggest performance optimization techniques for Oracle; we have already tried a lot :-)

    Read the article

  • Unable to import Maven project into IntelliJ IDEA

    - by del
    I'm having problems importing any Maven projects into IntelliJ IDEA. I create an empty Maven project like this: $ mvn archetype:generate -DgroupId=com.mycompany.app -DartifactId=my-app -DarchetypeArtifactId=maven-archetype-quickstart -DinteractiveMode=false Then I try to open the project in IDEA (File Open Project, then choose the pom.xml). A progress box saying "Reading pom.xml" displays for a few minutes, and then just disappears without opening the project. Looking in the IDEA log, I see some connection timeout exceptions like this: 2012-10-03 11:55:55,483 [ 16981] INFO - ution.rmi.RemoteProcessSupport - Port/ID: 18011/Maven2ServerImpl9407569f 2012-10-03 11:56:58,898 [ 80396] WARN - ution.rmi.RemoteProcessSupport - The cook failed to start due to java.net.ConnectException: Connection timed out 2012-10-03 11:57:55,483 [ 136981] WARN - ution.rmi.RemoteProcessSupport - java.rmi.NotBoundException: _DEAD_HAND_ 2012-10-03 11:57:55,484 [ 136982] WARN - ution.rmi.RemoteProcessSupport - at sun.rmi.registry.RegistryImpl.lookup(RegistryImpl.java:106) 2012-10-03 11:57:55,484 [ 136982] WARN - ution.rmi.RemoteProcessSupport - at com.intellij.execution.rmi.RemoteServer.start(RemoteServer.java:73) 2012-10-03 11:57:55,484 [ 136982] WARN - ution.rmi.RemoteProcessSupport - at org.jetbrains.idea.maven.server.RemoteMavenServer.main(RemoteMavenServer.java:22) 2012-10-03 11:58:01,749 [ 143247] ERROR - com.intellij.ide.IdeEventQueue - Error during dispatching of java.awt.event.MouseEvent[MOUSE_RELEASED,(65,116),absolute(64,140),button=1,modifiers=Button1,clickCount=1] on frame0 java.lang.RuntimeException: Cannot reconnect. at org.jetbrains.idea.maven.server.RemoteObjectWrapper.perform(RemoteObjectWrapper.java:82) at org.jetbrains.idea.maven.server.MavenServerManager.applyProfiles(MavenServerManager.java:311) at org.jetbrains.idea.maven.project.MavenProjectReader.applyProfiles(MavenProjectReader.java:369) at org.jetbrains.idea.maven.project.MavenProjectReader.doReadProjectModel(MavenProjectReader.java:98) at org.jetbrains.idea.maven.project.MavenProjectReader.readProject(MavenProjectReader.java:52) at org.jetbrains.idea.maven.project.MavenProject.read(MavenProject.java:405) at org.jetbrains.idea.maven.project.MavenProjectsTree.doUpdate(MavenProjectsTree.java:534) at org.jetbrains.idea.maven.project.MavenProjectsTree.doAdd(MavenProjectsTree.java:481) at org.jetbrains.idea.maven.project.MavenProjectsTree.update(MavenProjectsTree.java:442) at org.jetbrains.idea.maven.project.MavenProjectsTree.updateAll(MavenProjectsTree.java:413) at org.jetbrains.idea.maven.wizards.MavenProjectBuilder.readMavenProjectTree(MavenProjectBuilder.java:198) at org.jetbrains.idea.maven.wizards.MavenProjectBuilder.access$800(MavenProjectBuilder.java:44) at org.jetbrains.idea.maven.wizards.MavenProjectBuilder$3.run(MavenProjectBuilder.java:179) at org.jetbrains.idea.maven.utils.MavenUtil$8.run(MavenUtil.java:388) at com.intellij.openapi.progress.impl.ProgressManagerImpl$TaskRunnable.run(ProgressManagerImpl.java:469) at com.intellij.openapi.progress.impl.ProgressManagerImpl$6.run(ProgressManagerImpl.java:288) at com.intellij.openapi.progress.impl.ProgressManagerImpl$2.run(ProgressManagerImpl.java:178) at com.intellij.openapi.progress.impl.ProgressManagerImpl.executeProcessUnderProgress(ProgressManagerImpl.java:218) at com.intellij.openapi.progress.impl.ProgressManagerImpl.runProcess(ProgressManagerImpl.java:169) at com.intellij.openapi.application.impl.ApplicationImpl$8$1.run(ApplicationImpl.java:641) at 
com.intellij.openapi.application.impl.ApplicationImpl$6.run(ApplicationImpl.java:434) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) at java.util.concurrent.FutureTask.run(FutureTask.java:138) at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) at java.lang.Thread.run(Thread.java:662) at com.intellij.openapi.application.impl.ApplicationImpl$1$1.run(ApplicationImpl.java:145) Caused by: java.rmi.RemoteException: Cannot start maven service; nested exception is: java.rmi.ConnectException: Connection refused to host: localhost; nested exception is: java.net.ConnectException: Connection timed out at org.jetbrains.idea.maven.server.MavenServerManager.create(MavenServerManager.java:120) at org.jetbrains.idea.maven.server.MavenServerManager.create(MavenServerManager.java:71) at org.jetbrains.idea.maven.server.RemoteObjectWrapper.getOrCreateWrappee(RemoteObjectWrapper.java:41) at org.jetbrains.idea.maven.server.MavenServerManager$8.execute(MavenServerManager.java:314) at org.jetbrains.idea.maven.server.MavenServerManager$8.execute(MavenServerManager.java:311) at org.jetbrains.idea.maven.server.RemoteObjectWrapper.perform(RemoteObjectWrapper.java:76) ... 27 more Caused by: java.rmi.ConnectException: Connection refused to host: localhost; nested exception is: java.net.ConnectException: Connection timed out at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:601) at sun.rmi.transport.tcp.TCPChannel.createConnection(TCPChannel.java:198) at sun.rmi.transport.tcp.TCPChannel.newConnection(TCPChannel.java:184) at sun.rmi.server.UnicastRef.newCall(UnicastRef.java:322) at sun.rmi.registry.RegistryImpl_Stub.lookup(Unknown Source) at com.intellij.execution.rmi.RemoteProcessSupport$2.compute(RemoteProcessSupport.java:215) at com.intellij.execution.rmi.RemoteUtil.executeWithClassLoader(RemoteUtil.java:122) at com.intellij.execution.rmi.RemoteProcessSupport.acquire(RemoteProcessSupport.java:212) at com.intellij.execution.rmi.RemoteProcessSupport.acquire(RemoteProcessSupport.java:133) at org.jetbrains.idea.maven.server.MavenServerManager.create(MavenServerManager.java:117) ... 32 more Caused by: java.net.ConnectException: Connection timed out at java.net.PlainSocketImpl.socketConnect(Native Method) at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:351) at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:213) at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:200) at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:366) at java.net.Socket.connect(Socket.java:529) at java.net.Socket.connect(Socket.java:478) at java.net.Socket.(Socket.java:375) at java.net.Socket.(Socket.java:189) at sun.rmi.transport.proxy.RMIDirectSocketFactory.createSocket(RMIDirectSocketFactory.java:22) at sun.rmi.transport.proxy.RMIMasterSocketFactory.createSocket(RMIMasterSocketFactory.java:128) at sun.rmi.transport.tcp.TCPEndpoint.newSocket(TCPEndpoint.java:595) ... 41 more I'm using the latest versions of IDEA (11.1.3) and Maven (3.0.4). Any ideas what I am doing wrong?

    Read the article

  • How to import CSV data into Django models

    - by little_fish
    I have some CSV data and I want to import it into Django models. Here is an example of the CSV data: 1;"02-01-101101";"Worm Gear HRF 50";"Ratio 1 : 10";"input shaft, output shaft, direction A, color dark green"; 2;"02-01-101102";"Worm Gear HRF 50";"Ratio 1 : 20";"input shaft, output shaft, direction A, color dark green"; 3;"02-01-101103";"Worm Gear HRF 50";"Ratio 1 : 30";"input shaft, output shaft, direction A, color dark green"; 4;"02-01-101104";"Worm Gear HRF 50";"Ratio 1 : 40";"input shaft, output shaft, direction A, color dark green"; 5;"02-01-101105";"Worm Gear HRF 50";"Ratio 1 : 50";"input shaft, output shaft, direction A, color dark green"; I also have a Django model named Product, which has fields like name, description and price, and I want to do something like this: product = Product() product.name = "Worm Gear HRF 70(02-01-101116)" product.description = "input shaft, output shaft, direction A, color dark green" product.price = 100
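
    A minimal sketch of one way to do this (the app name myapp and the exact column-to-field mapping are assumptions based on the sample rows, not taken from the question): read the semicolon-delimited file with Python's csv module and create one Product per row.

        import csv

        from myapp.models import Product  # hypothetical app name

        def import_products(path):
            with open(path, newline="") as handle:
                reader = csv.reader(handle, delimiter=";")
                for row in reader:
                    if len(row) < 5:
                        continue  # skip blank or malformed lines
                    _, code, name, ratio, description = row[:5]
                    Product.objects.create(
                        name="%s (%s)" % (name, code),  # e.g. "Worm Gear HRF 50 (02-01-101101)"
                        description=description,
                        price=100,  # price is not present in the CSV; placeholder value
                    )

    This needs to run with Django configured, for example from python manage.py shell or a custom management command.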

    Read the article

  • Unresolved import: models

    - by Timmy O' Tool
    Hi! I'm doing my VERY first project using Python/Django/Eclipse/PyDev, following this guide: http://docs.djangoproject.com/en/dev/intro/tutorial01/ My only addition is the use of Eclipse/PyDev. I'm getting many errors related to "Unresolved imports". I can remove the errors using "remove error markers", and my site runs perfectly (I can browse it), but I want to get rid of this problem for good, since the errors reappear after I remove them. Any ideas?

    Read the article

  • How to use the Windows Live Contacts API to allow users to import contacts

    - by AlfaTeK
    I can't find a good, simple tutorial on how to achieve this in PHP/JavaScript. Basically it's the same thing Facebook has, where the user specifies their Hotmail account and then gives Facebook permission to access their contacts. As the API seems to have changed, most tutorials simply don't work, like http://livecontactsphp.codeplex.com/ which doesn't get the contacts (but does authenticate and get permission from the user).

    Read the article
