Search Results

Search found 10691 results on 428 pages for 'batch insert'.

Page 88/428

  • SQLite - INSERT does not return an error but no data is inserted.

    - by Nick
    I am attempting to insert data into a local SQLite database file from a C# application. The transaction does not throw any errors, but the data is not inserted. The same insert statement works from within a query analyzer. Do I need to perform a commit? Is there a Commit method? The command's Transaction property is null.

        var command = new SQLiteCommand(insert.BuildInsert(tableName, keyValuePairs), Connection);
        command.ExecuteNonQuery();
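    One thing worth checking is whether the insert runs inside an explicit transaction that is never committed. Below is a minimal sketch (not the original code; the table, column and connection string are assumptions) using the System.Data.SQLite provider with an explicit transaction and Commit:

        using System.Data.SQLite;

        using (var connection = new SQLiteConnection(@"Data Source=c:\data\app.db"))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            using (var command = new SQLiteCommand(
                "INSERT INTO Items (Name) VALUES (@name)", connection, transaction))
            {
                command.Parameters.AddWithValue("@name", "example");
                command.ExecuteNonQuery();
                transaction.Commit();   // without Commit, disposing the transaction rolls the insert back
            }
        }

    If no transaction is open, ExecuteNonQuery is auto-committed, in which case the usual culprit is a connection string pointing at a different database file than the one being inspected.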

    Read the article

  • How to insert a value based on lookup from another table [SQL]?

    - by Shaitan00
    I need to find a way to do an INSERT INTO table A where one of the values comes from a lookup on table B; allow me to illustrate. I have the two following tables:

        Table A:
          A1: String
          A2: Integer value coming from table B
          A3: More Data
        Table B:
          B1: String
          B2: Integer Value

        Example row of A: {"Value", 101, MoreData}
        Example row of B: {"English", 101}

    Now, I know I need to INSERT the following into A: {"Value2", "English", MoreData}, but obviously that won't work because the second column expects an Integer, not the word "English", so I need to do a lookup in Table B first. Something like this:

        INSERT INTO tableA (A1, A2, A3) VALUES ("Value2", SELECT B2 FROM tableB where B1="English", MoreData);

    Obviously this doesn't work as-is ... Any suggestions?
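    A sketch of two common ways to phrase this, using the table and column names from the question; the key detail is that the scalar subquery needs its own parentheses, or the whole statement can be written as INSERT ... SELECT:

        -- Option 1: a scalar subquery in the VALUES list
        INSERT INTO tableA (A1, A2, A3)
        VALUES ('Value2', (SELECT B2 FROM tableB WHERE B1 = 'English'), 'MoreData');

        -- Option 2: the whole insert as INSERT ... SELECT
        INSERT INTO tableA (A1, A2, A3)
        SELECT 'Value2', B2, 'MoreData'
        FROM tableB
        WHERE B1 = 'English';

    Note that standard SQL string literals use single quotes; double quotes usually denote identifiers.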

    Read the article

  • Insert a default row into a combobox that is bound to a datatable?

    - by John M
    On a WinForm there is a ComboBox that derives its information from a DataTable. The DataTable draws from a database list.

        this.cboList.DataSource = pullData();
        this.cboList.DisplayMember = "fieldA";

    Once the DataSource is set I am not able to insert a default row (i.e. *) as the first item in the ComboBox. I tried this:

        this.cboList.Items.Insert(0, "*");

    Is there a way to insert into the ComboBox after the DataSource is set, or should this be done in the DataTable?
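    One approach is to add the default row to the DataTable itself before binding, since a databound ComboBox rejects Items.Insert. A minimal sketch, assuming pullData() returns a DataTable:

        DataTable table = pullData();

        DataRow defaultRow = table.NewRow();
        defaultRow["fieldA"] = "*";
        table.Rows.InsertAt(defaultRow, 0);   // put the "*" row first

        this.cboList.DataSource = table;
        this.cboList.DisplayMember = "fieldA";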

    Read the article

  • Overload and hide methods in Java

    - by Marco
    Hi, I have an abstract class BaseClass with a public insert() method:

        public abstract class BaseClass {
            public void insert(Object object) {
                // Do something
            }
        }

    which is extended by many other classes. For some of those classes, however, the insert() method must have additional parameters, so instead of overriding it I overload the method of the base class with the parameters required, for example:

        public class SampleClass extends BaseClass {
            public void insert(Object object, Long param) {
                // Do Something
            }
        }

    Now, if I instantiate the SampleClass class, I have two insert() methods:

        SampleClass sampleClass = new SampleClass();
        sampleClass.insert(Object object);
        sampleClass.insert(Object object, Long param);

    What I'd like to do is to hide the insert() method defined in the base class, so that just the overload would be visible:

        SampleClass sampleClass = new SampleClass();
        sampleClass.insert(Object object, Long param);

    Could this be done in OOP?
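    Strictly speaking this cannot be done with inheritance: Java does not allow a subclass to reduce the visibility of an inherited public method, so insert(Object) stays callable on SampleClass. A sketch of the usual workaround, composition instead of inheritance (the delegate field is an assumption, not code from the post):

        public class SampleClass {
            // BaseClass has no abstract methods, so an anonymous subclass is enough here
            private final BaseClass delegate = new BaseClass() { };

            public void insert(Object object, Long param) {
                // use param as needed, then reuse the base behaviour
                delegate.insert(object);
            }
        }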

    Read the article

  • How to run a progress-bar through an insert query?

    - by Gold
    I have this insert query:

        try
        {
            Cmd.CommandText = @"INSERT INTO BarcodTbl SELECT * FROM [Text;DATABASE=" + PathI + @"\].[Tmp.txt];";
            Cmd.ExecuteNonQuery();
            Cmd.Dispose();
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }

    I have two questions:
    1. How can I run a progress bar from the beginning to the end of the insert?
    2. If there is an error, I get the error exception and the action stops - the query stops and BarcodTbl is empty. How can I see the error and allow the query to continue filling the table?
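    Since the whole INSERT ... SELECT runs as a single command, the server reports no row-by-row progress, so a common compromise is a marquee-style progress bar while the command runs on a background thread; continuing past bad rows would mean inserting line by line inside a try/catch instead of one bulk statement. A rough WinForms sketch (assuming the command is an OleDbCommand and that progressBar1, PathI and an open connection Conn already exist on the form; needs System.ComponentModel, System.Data.OleDb and System.Windows.Forms):

        progressBar1.Style = ProgressBarStyle.Marquee;   // indeterminate while the insert runs

        var worker = new BackgroundWorker();
        worker.DoWork += (s, e) =>
        {
            using (var cmd = new OleDbCommand(
                @"INSERT INTO BarcodTbl SELECT * FROM [Text;DATABASE=" + PathI + @"\].[Tmp.txt];", Conn))
            {
                cmd.ExecuteNonQuery();
            }
        };
        worker.RunWorkerCompleted += (s, e) =>
        {
            progressBar1.Style = ProgressBarStyle.Blocks;
            if (e.Error != null) MessageBox.Show(e.Error.Message);
        };
        worker.RunWorkerAsync();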

    Read the article

  • INSERT INTO ... SELECT ... vs dumping/loading a file in MySQL

    - by Daniel Huckstep
    What are the implications of using an INSERT INTO foo ... SELECT FROM bar JOIN baz ... style insert statement versus using the same SELECT statement to dump (bar, baz) to a file, and then inserting into foo by loading the file? In my messing around, I haven't seen a huge difference. I would assume the former would use more memory, but the machine this runs on has 8 GB of RAM, and I never even see it go past half used. Are there any huge (or long-term) performance implications that I'm not seeing? Advantages/disadvantages of either?
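    For reference, a sketch of the two approaches being compared (column names are made up). The single-statement form does everything inside the server in one pass; the file-based form trades an extra round trip through the filesystem for the ability to inspect, split, or reload the dump later:

        -- Direct: one statement, no intermediate file
        INSERT INTO foo (col1, col2)
        SELECT bar.col1, baz.col2
        FROM bar
        JOIN baz ON baz.bar_id = bar.id;

        -- File-based: dump the SELECT, then bulk load it
        SELECT bar.col1, baz.col2
        FROM bar
        JOIN baz ON baz.bar_id = bar.id
        INTO OUTFILE '/tmp/foo_dump.csv'
        FIELDS TERMINATED BY ',';

        LOAD DATA INFILE '/tmp/foo_dump.csv'
        INTO TABLE foo
        FIELDS TERMINATED BY ','
        (col1, col2);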

    Read the article

  • How to insert large files into a MySQL database using PHP? [closed]

    - by anjan
    Hi! I want to upload a large file of at most 10 MB to my MySQL database. Using .htaccess I changed PHP's own file upload limit to "10485760" = 10 MB, and I am able to upload files up to 10 MB without any problem. But I cannot insert the file into the database if it is larger than about 1 MB. I am using file_get_contents to read all the file data and pass it to the insert query as a string to be inserted into a LONGBLOB field. But files larger than 1 MB are not added to the database, though I can use print_r($_FILES) to confirm that the file uploaded correctly. Any help will be appreciated and I will need it within the next 6 hours. So, please help! best regards, Anjan * This is a duplicate of http://stackoverflow.com/questions/492549/how-can-i-insert-large-files-in-mysql-db-using-php *
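    The 1 MB ceiling is the classic symptom of MySQL's max_allowed_packet setting, whose default was 1 MB on older servers, so the INSERT carrying the LONGBLOB is rejected regardless of the PHP upload limits. A sketch of what to check and raise (the 16 MB figure is just an example; persist the change in my.cnf under [mysqld] so it survives a restart):

        -- check the current limit (value is in bytes)
        SHOW VARIABLES LIKE 'max_allowed_packet';

        -- raise it for the running server
        SET GLOBAL max_allowed_packet = 16 * 1024 * 1024;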

    Read the article

  • Problem matching font size to the screen resolution in libgdx

    - by Iñaki Bedoya
    I'm having problems to show text on my game at same size on different screens, and I did a simple test. This test consists to show a text fitting at the screen, I want the text has the same size independently from the screen and from DPI. I've found this and this answer that I think should solve my problem but don't. In desktop the size is ok, but in my phone is too big. This is the result on my Nexus 4: (768x1280, 2.0 density) And this is the result on my MacBook: (480x800, 0.6875 density) I'm using the Open Sans Condensed (link to google fonts) As you can see on desktop looks good, but on the phone is so big. Here the code of my test: public class TextTest extends ApplicationAdapter { private static final String TAG = TextTest.class.getName(); private static final String TEXT = "Tap the screen to start"; private OrthographicCamera camera; private Viewport viewport; private SpriteBatch batch; private BitmapFont font; @Override public void create () { Gdx.app.log(TAG, "Screen size: "+Gdx.graphics.getWidth()+"x"+Gdx.graphics.getHeight()); Gdx.app.log(TAG, "Density: "+Gdx.graphics.getDensity()); camera = new OrthographicCamera(); viewport = new ExtendViewport(Gdx.graphics.getWidth(), Gdx.graphics.getWidth(), camera); batch = new SpriteBatch(); FreeTypeFontGenerator generator = new FreeTypeFontGenerator(Gdx.files.internal("fonts/OpenSans-CondLight.ttf")); font = createFont(generator, 64); generator.dispose(); } private BitmapFont createFont(FreeTypeFontGenerator generator, float dp) { FreeTypeFontGenerator.FreeTypeFontParameter parameter = new FreeTypeFontGenerator.FreeTypeFontParameter(); int fontSize = (int)(dp * Gdx.graphics.getDensity()); parameter.size = fontSize; Gdx.app.log(TAG, "Font size: "+fontSize+"px"); return generator.generateFont(parameter); } @Override public void render () { Gdx.gl.glClearColor(1, 1, 1, 1); Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT); int w = -(int)(font.getBounds(TEXT).width / 2); batch.setProjectionMatrix(camera.combined); batch.begin(); font.setColor(Color.BLACK); font.draw(batch, TEXT, w, 0); batch.end(); } @Override public void resize(int width, int height) { viewport.update(width, height); } @Override public void dispose() { font.dispose(); batch.dispose(); } } I'm trying to find a neat way to fix this. What I'm doing wrong? is the camera? the viewport? UPDATE: What I want is to keep the same margins in proportion, independently of the screen size or resolution. This image illustrates what I mean.
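    One direction to try (an assumption about the intent, not a confirmed fix): keep the camera/viewport in fixed virtual units rather than physical pixels, and derive the font size from the ratio of the physical width to that virtual width, so the text occupies the same fraction of every screen regardless of DPI. A rough sketch of only the changed lines; createFontWithPixelSize is a hypothetical helper that takes a pixel size directly:

        private static final float VIRTUAL_WIDTH = 480f;
        private static final float VIRTUAL_HEIGHT = 800f;

        // in create(): a viewport in virtual units, not Gdx.graphics.getWidth()/getHeight()
        viewport = new ExtendViewport(VIRTUAL_WIDTH, VIRTUAL_HEIGHT, camera);

        // generate the font from the physical-to-virtual scale instead of getDensity()
        int fontSize = (int) (32 * Gdx.graphics.getWidth() / VIRTUAL_WIDTH);
        font = createFontWithPixelSize(generator, fontSize);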

    Read the article

  • How to implement dynamic binary search for search and insert operations on n elements (C or C++)

    - by iecut
    The idea is to use multiple arrays, each of length 2^k, to store n elements, according to binary representation of n.Each array is sorted and different arrays are not ordered in any way. In the above mentioned data structure, SEARCH is carried out by a sequence of binary search on each array. INSERT is carried out by a sequence of merge of arrays of the same length until an empty array is reached. More Detail: Lets suppose we have a vertical array of length 2^k and to each node of that array there attached horizontal array of length 2^k. That is, to the first node of vertical array, a horizontal array of length 2^0=1 is connected,to the second node of vertical array, a horizontal array of length 2^1= 2 is connected and so on. So the insert is first carried out in the first horizontal array, for the second insert the first array becomes empty and second horizontal array is full with 2 elements, for the third insert 1st and 2nd array horiz. array are filled and so on. I implemented the normal binary search for search and insert as follows: int main() { int a[20]= {0}; int n, i, j, temp; int *beg, *end, *mid, target; printf(" enter the total integers you want to enter (make it less then 20):\n"); scanf("%d", &n); if (n = 20) return 0; printf(" enter the integer array elements:\n" ); for(i = 0; i < n; i++) { scanf("%d", &a[i]); } // sort the loaded array, binary search! for(i = 0; i < n-1; i++) { for(j = 0; j < n-i-1; j++) { if (a[j+1] < a[j]) { temp = a[j]; a[j] = a[j+1]; a[j+1] = temp; } } } printf(" the sorted numbers are:"); for(i = 0; i < n; i++) { printf("%d ", a[i]); } // point to beginning and end of the array beg = &a[0]; end = &a[n]; // use n = one element past the loaded array! // mid should point somewhere in the middle of these addresses mid = beg += n/2; printf("\n enter the number to be searched:"); scanf("%d",&target); // binary search, there is an AND in the middle of while()!!! while((beg <= end) && (*mid != target)) { // is the target in lower or upper half? if (target < *mid) { end = mid - 1; // new end n = n/2; mid = beg += n/2; // new middle } else { beg = mid + 1; // new beginning n = n/2; mid = beg += n/2; // new middle } } // find the target? if (*mid == target) { printf("\n %d found!", target); } else { printf("\n %d not found!", target); } getchar(); // trap enter getchar(); // wait return 0; } Could anyone please suggest how to modify this program or a new program to implement dynamic binary search that works as explained above!!

    Read the article

  • Wix - How do I specify a directory to run a batch file in?

    - by Mike Pateras
    I want to run a batch file, which I do via the following:

        <CustomAction Id='InstallFilter' FileKey='install' ExeCommand='' Execute='deferred' />
        <InstallExecuteSequence>
          <Custom Action='InstallFilter' Before='InstallFinalize' />
        </InstallExecuteSequence>

    This will execute the batch file, but it runs in C:\Windows\System32 (or something like that). I want it to run in the directory that the file is found in. It won't let me specify a Directory attribute together with a FileKey attribute. How can I tell the installer to run out of a specific directory, preferably by the directory ID? Also, when I try to uninstall my app with this script, I get an error message saying "There is a problem with the Windows Installer package. A program required for the install to complete could not be run." This makes sense, as by the time the script gets run, the files have been removed. The question is: How do I specify that my action should only be run on install, not uninstall? And how do I uninstall this current copy?
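    A sketch of one way to do both (identifiers such as INSTALLFOLDER and install.bat are assumptions, and the formatting of ExeCommand is worth validating against your setup): switch the custom action from FileKey to Directory plus ExeCommand so it gets a working directory, and put a condition on the sequencing element so it only fires on first-time installs:

        <CustomAction Id='InstallFilter'
                      Directory='INSTALLFOLDER'
                      ExeCommand='cmd.exe /c "[INSTALLFOLDER]install.bat"'
                      Execute='deferred'
                      Return='check' />

        <InstallExecuteSequence>
          <Custom Action='InstallFilter' Before='InstallFinalize'>NOT Installed</Custom>
        </InstallExecuteSequence>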

    Read the article

  • Import MySQL file in PHP

    - by Cudos
    I have a MySQL file which I want to import via PHP 5. In the name of user friendliness the user should not use tools like PHPmyadmin etc. Just hit a button and the file will get imported. I have already created code to upload the file to a location on the server. The file looks like this: INSERT INTO products VALUES ('', '0', '10', '', '1', 'be34112', '4536.jpg', '','','','0'); SET @master_id = LAST_INSERT_ID(); INSERT INTO products_description VALUES ('', '1', @master_id, '1', 'Kjole', '', 'beskrivelse', '2000', '25', 'kjole.xml', '', '', ''); INSERT INTO products_to_categories VALUES ('',@master_id,'5'); INSERT INTO products VALUES ('', @master_id, '10', '12', '1', 'be34112', '4536.jpg', '200','','','0'); SET @variant_id = LAST_INSERT_ID(); INSERT INTO products_description VALUES ('', '1', @variant_id, '1', 'Kjole', '', 'beskrivelse', '2000', '25', 'kjole.xml', '', '', ''); INSERT INTO options_to_products VALUES ('', @variant_id, '1', '1'); INSERT INTO options_to_products VALUES ('', @variant_id, '', '2'); INSERT INTO products VALUES ('', @master_id, '20', '17', '1', 'be34113', '4537.jpg', '200','','','0'); SET @variant_id = LAST_INSERT_ID(); INSERT INTO products_description VALUES ('', '1', @variant_id, '1', 'Kjole', '', 'beskrivelse æøå ÆØÅ & íjj´¨¨¨¨fdfd""', '3000', '25', 'kjole.xml', '', '', ''); INSERT INTO options_to_products VALUES ('', @variant_id, '1', ''); INSERT INTO options_to_products VALUES ('', @variant_id, '', '4');

    Read the article

  • How to pass an integration property to a batch file with CruiseControl.NET?

    - by TridenT
    In the build log of my project, i can see these properties: <integrationProperties> <CCNetProject>Gdet_T</CCNetProject> ... <LastChangeNumber>0</LastChangeNumber> <LastIntegrationStatus>Success</LastIntegrationStatus> <LastSuccessfulIntegrationLabel>25</LastSuccessfulIntegrationLabel> <LastModificationDate>4/6/2010 1:29:04 PM</LastModificationDate> <LastChangeNumber>10841</LastChangeNumber> </integrationProperties> I want to pass the property CCNetProject and LastChangeNumber to a batch file. it works well with CCNetProject, as it can be used in the batch as an environment variable %CCNetProject%. But it doesn't work with other properties (those are not starting with the CCnet prefix) as LastChangeNumber or LastModificationDate. I tried to pass it as environment variable, but it fails ! <exec> <executable>$(WorkingFolderBase)\MyBatch.bat</executable> <baseDirectory>$(WorkingFolderBase)\</baseDirectory> <buildArgs>$(LastModificationDate)</buildArgs> </exec> I tried to pass it as argument, but it fails: <exec> <executable>$(WorkingFolderBase)\MyBatch.bat</executable> <baseDirectory>$(WorkingFolderBase)\</baseDirectory> <environment> <variable> <name>svn_label</name> <value>"${LastModificationDate}"</value> </variable> </environment> </exec> The results is always the same when I display the parameter or variable : empty string or the variable name $(svn_label) I'm sure it is simple, but ... I can't find ! Any idea ?

    Read the article

  • Which is the best way to batch-encode videos on the server side?

    - by albanx
    Hello, I am asking a general question since I am a developer and have no advanced experience in video processing. I have to prepare a web application whose purpose is to allow video files to be uploaded to our company server and then processed by the server, on user command. The web application should let the user trigger various operations on a video; the server has to:

    - convert video into different formats (mp4, flv...)
    - extract keyframes from video and save them in jpeg format
    - possibly extract audio from video
    - automatically check audio & video quality (black frames, silence detection)
    - detect scene changes and extract keyframes
    - .....

    This is what my bosses want from the web-based application (with the server doing the work, obviously), and I understand only the first 3 points of this list; the rest was Arabic to me.... My question is: Which is the best and fastest server-side application for this work, one that can support multiple batch video conversions from the command line (command line for PHP-SOAP-socket interaction or something else..)? Is Adobe Media Server suitable for batch video conversion? Which Adobe products can be used for this purpose? Note: I have experience with InDesign Server scripting (sending XML with PHP and SOAP calls...), and I am looking for something similar for video processing. I will appreciate any answers. THANKS ALL
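    For the first three points, the de facto command-line workhorse is FFmpeg, which a PHP backend can invoke per uploaded file (paths and codec choices below are only examples):

        # convert to MP4 (H.264) and to FLV
        ffmpeg -i input.mov -c:v libx264 -c:a aac output.mp4
        ffmpeg -i input.mov output.flv

        # extract one keyframe per second as JPEGs
        ffmpeg -i input.mov -vf fps=1 frames/frame_%04d.jpg

        # extract the audio track without re-encoding
        ffmpeg -i input.mov -vn -c:a copy audio.m4a

    Black-frame, silence and scene-change detection are also available through FFmpeg filters (blackdetect, silencedetect, and the select filter's scene score), though they need more scripting around them.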

    Read the article

  • mysql report sql help

    - by sfgroups
    I have mysql table with data like this. record will have server with total cpu and virtual server with cpu assinged type, cpu srv1, 10 vsrv11, 2 vsrv12, 3 srv2, 15 vsrv21, 6 vsrv22, 7 vsrv23, 1 from the above data, I want to create output like this. server, total cpu, assigned cpu, free cpu srv1, 10, 5, 5 srv2, 15, 14, 1 Can you help me on creating sql query for this report? I have changed my table and data like this. CREATE TABLE `cpuallocation` ( `servertype` varchar(10) DEFAULT NULL, `servername` varchar(20) DEFAULT NULL, `hostname` varchar(20) DEFAULT NULL, `cpu_count` float DEFAULT NULL, UNIQUE KEY `server_uniq_idx` (`servertype`,`servername`,`hostname`) insert into cpuallocation values('srv', 'server1', '',16); insert into cpuallocation values('vir', 'server1', 'host1',5); insert into cpuallocation values('vir', 'server1', 'host2',2.5); insert into cpuallocation values('vir', 'server1', 'host3',4.5); insert into cpuallocation values('srv', 'server2', '',8); insert into cpuallocation values('vir', 'server2', 'host1',5); insert into cpuallocation values('vir', 'server2', 'host2',2.5); insert into cpuallocation values('srv', 'server3', '',24); insert into cpuallocation values('vir', 'server3', 'host1',12); insert into cpuallocation values('vir', 'server3', 'host2',2); insert into cpuallocation values('srv', 'server4', '',12); Update: I created two view, now I getting the result I want. create view v1 as select servername, sum(cpu_count) as cpu_allocated from cpuallocation where servertype='vir' group by servername; create view v2 as select servername, cpu_count as total_cpu from cpuallocation where servertype='srv'; select a.servername, a.total_cpu, b.cpu_allocated from v2 as a left join v1 as b on a.servername=b.servername; +------------+-----------+---------------+ | servername | total_cpu | cpu_allocated | +------------+-----------+---------------+ | server1 | 16 | 12 | | server2 | 8 | 7.5 | | server3 | 24 | 14 | | server4 | 12 | NULL | +------------+-----------+---------------+ 4 rows in set (0.00 sec) Is it possible to create a query with-out creating views?
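    To answer the final question: yes, the same report can come from a single self-join over cpuallocation, without the helper views. A sketch using the schema above:

        SELECT s.servername,
               s.cpu_count                                   AS total_cpu,
               SUM(v.cpu_count)                              AS cpu_allocated,
               s.cpu_count - IFNULL(SUM(v.cpu_count), 0)     AS free_cpu
        FROM cpuallocation s
        LEFT JOIN cpuallocation v
               ON v.servername = s.servername AND v.servertype = 'vir'
        WHERE s.servertype = 'srv'
        GROUP BY s.servername, s.cpu_count;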

    Read the article

  • Bulk inserting - best way to go about it? + Help me fully understand what I found so far

    - by chobo2
    Hi So I saw this post here and read it and it seems like bulk copy might be the way to go. http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials. http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx First way uses 2 ado.net 2.0 features. BulkInsert and BulkCopy. the second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However as one person pointed out in the posts what he just going around the issue at the cost of performance( nothing wrong with that in my opinion) First I will talk about the 2 ways in the first tutorial I am using VS2010 Express, .net 4.0, MVC 2.0, SQl Server 2005 Is ado.net 2.0 the most current version? Based on the technology I am using, is there some updates to what I am going to show that would improve it somehow? Is there any thing that these tutorial left out that I should know about? BulkInsert I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) ) SP Code USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END C# Code /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } So first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? Like I am sending 500,000 records so I did a Batch size of 500,000. Next why does it crash when I do this? If I set it to 1000 for batch size it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. 
(provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException: Time it took to insert 500,000 records with insert batch size of 1000 took "2 mins and 54 seconds" Of course this is no official time I sat there with a stop watch( I am sure there are better ways but was too lazy to look what they where) So I find that kinda slow compared to all my other ones(expect the linq to sql insert one) and I am not really sure why. Next I looked at bulkcopy /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } This one seemed to go really fast and did not even need a SP( can you use SP with bulk copy? If you can would it be better?) BatchCopy had no problem with a 500,000 batch size.So again why make it smaller then the number of records you want to send? I found that with BatchCopy and 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster then the bulkinsert one above. Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc C# code. /// <summary> /// This is using linq to sql to make the table objects. 
/// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } So I like this because I get to use objects even though it is kinda redundant. I don't get how the SP works. Like I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood but I do not even know how to take this example SP and change it to fit my tables since like I said I don't know what is going on. I also don't know what would happen if the object you have more tables in it. Like say I have a ProductName table what has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good for 500,000 records it took 52 seconds The last way of course was just using linq to do it all and it was pretty bad. /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records and that took over a minute to do. So I really narrowed it done to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationship for either way. I am not sure how they both stand up when doing updates instead of inserts as I have not gotten around to try it yet. I don't think I will ever need to insert/update more than 50,000 records at one type but at the same time I know I will have to do validation on records before inserting so that will slow it down and that sort of makes linq to sql nicer as your got objects especially if your first parsing data from a xml file before you insert into the database. Full C# code using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. 
/// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. 
adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }

    Read the article

  • How can I insert the contents of a file into another file right before a specific line

    - by jelbatnigi
    Hi, How can I can insert the contents of a file into another file right before a specific line using sed. example I have file1.xml that has the following: <field tagRef="376"> </field> <field tagRef="377"> </field> <field tagRef="58"> </field> <group ref="StandardMessageTrailer" required="true"/> </fieldList> and file2.xml has the following: <field tagRef="9647"> <description>Offset</description> </field> <field tagRef="9648"> <description>Offset Units/Direction</description> </field> <field tagRef="9646"> <description>Anchor Price</description> </field> how can i insert the contents of file2 into file1 just before so it will look like this: <field tagRef="376"> </field> <field tagRef="377"> </field> <field tagRef="58"> </field> <field tagRef="9647"> <description>Offset</description> </field> <field tagRef="9648"> <description>Offset Units/Direction</description> </field> <field tagRef="9646"> <description>Anchor Price</description> </field> <group ref="StandardMessageTrailer" required="true"/> </fieldList> I know how to insert after that line using sed 'group ref="StandardMessageTrailer"/r file2.xml' file1.xml newfile.xml but I want to insert it before. appreciate the help

    Read the article

  • Clear the data from the control after inserting the row?

    - by kathy
    Hi, I'm using ASPxGridView and performing the add operation in the RowInserting event, and I don't call gridView.CancelEdit(); because I need the add form to stay visible so another row can be inserted. My problem is: how can I clear the data from the controls in the add form so that I can enter the new data for the second row? Thanks

    Read the article

  • How to bulk insert from CSV when some fields have newline characters?

    - by z-boss
    I have a CSV dump from another DB that looks like this (id, name, notes):

        1001,John Smith,15 Main Street
        1002,Jane Smith,"2010 Rockliffe Dr.
        Pleasantville, IL USA"
        1003,Bill Karr,2820 West Ave.

    The last field may contain carriage returns and commas, in which case it is surrounded by double quotes. I use this code to import the CSV into my table:

        BULK INSERT CSVTest
        FROM 'c:\csvfile.csv'
        WITH
        (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n'
        )

    SQL Server 2005 bulk insert cannot figure out that carriage returns inside quotes are not row terminators. How can I overcome this?
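    On SQL Server 2005 the BULK INSERT parser has no notion of quoted fields, so the usual options are a format file, pre-processing the file, or loading through SSIS/OPENROWSET; newer versions (2017 onward) added a CSV mode that handles this directly. A sketch of that newer syntax, in case upgrading is an option:

        BULK INSERT CSVTest
        FROM 'c:\csvfile.csv'
        WITH
        (
            FORMAT = 'CSV',        -- available from SQL Server 2017
            FIELDQUOTE = '"',
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n'
        );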

    Read the article

  • How to insert a foreign key value using LINQ to Entities?

    - by Gamble
    I have a table (TestTable), for example: ID, Name, Parentid (FK), and I would like to insert a new record like ID(1), Name(Test), ParentID(5) FK. How can I insert a new record into TestTable with LINQ to Entities?

        var testTable = new TestTable();
        testTable.ID = 1;
        testTable.Name = "TestName";
        testTable ...

    Thank you for a working example.
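    A sketch of one way this is commonly done (the context name, the generated AddTo method and whether the FK column is exposed as a scalar property all depend on the model, so treat these names as assumptions):

        using (var context = new MyEntities())
        {
            var row = new TestTable
            {
                ID = 1,
                Name = "TestName",
                Parentid = 5   // works when the model exposes the FK column directly (EF4 "FK associations")
            };

            context.AddToTestTable(row);
            context.SaveChanges();
        }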

    Read the article

  • Why can't I insert a record with a foreign key in a single server request?

    - by Eran Betzalel
    I'm trying to do a simple insert with a foreign key, but it seems that I need to call db.SaveChanges() for every record insert. How can I manage to use only one db.SaveChanges() at the end of this program?

        foreach (var file in files)
        {
            db.AddToFileSet(file);
            db.SaveChanges();
            db.AddToDirectorySet(
                new GlxCustomerPhone
                {
                    SimIdentifier = file.Name + "Dir",
                    CreationDate = DateTime.UtcNow,
                    file_relation = file
                });
            db.SaveChanges();
        }
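    A sketch of the single-SaveChanges version of the same loop: because each GlxCustomerPhone row points at its file through the file_relation navigation property, Entity Framework can work out the insert order itself, so one call at the end should be enough (worth verifying against the actual model):

        foreach (var file in files)
        {
            db.AddToFileSet(file);
            db.AddToDirectorySet(new GlxCustomerPhone
            {
                SimIdentifier = file.Name + "Dir",
                CreationDate = DateTime.UtcNow,
                file_relation = file
            });
        }

        db.SaveChanges();   // one save; the relationship determines the ordering of the inserts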

    Read the article

  • Problem inserting Thai language data into a SQL Server 2008 text field: it shows ????

    - by embarus
    Hello everyone, I created an ASP.NET MVC web application and tried to insert Thai language data into a SQL Server 2008 field with data type text, but the database stores ??????, which is incorrect. For the HTML page I use charset utf-8. I also tried encoding the string before inserting the data and changing the database field collation, but these do not solve the problem. I'm looking forward to your reply. Thanks, embarus
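    The ?????? output is the classic sign that the value passed through a non-Unicode path: the text data type and plain varchar literals store Thai as question marks regardless of the page charset or collation. A sketch of the usual fix, switching to Unicode column types and Unicode literals/parameters (table and column names are made up):

        -- store the value in a Unicode column instead of text/varchar
        ALTER TABLE Articles ALTER COLUMN Body NVARCHAR(MAX);

        -- and make sure literals (or the parameter types in code) are Unicode as well
        INSERT INTO Articles (Body) VALUES (N'ทดสอบภาษาไทย');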

    Read the article

  • How do I use Django to insert a Geometry Field into the database?

    - by alex
        class LocationLog(models.Model):
            user = models.ForeignKey(User)
            utm = models.GeometryField(spatial_index=True)

    This is my database model. I would like to insert a row: a circle at point (-55, 333) with a radius of 10. How can I put this circle into the geometry field? Of course, then I would want to check which circles overlap a given circle (my select statement).
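    A sketch using GeoDjango's GEOS API, where a circle is typically stored as a buffered point (the buffer is a polygon approximation of the circle; variable names are assumptions):

        from django.contrib.gis.geos import Point

        # build a polygon approximating a circle of radius 10 around (-55, 333) and save it
        circle = Point(-55, 333).buffer(10)
        LocationLog.objects.create(user=some_user, utm=circle)

        # later: rows whose stored geometry overlaps another circle
        probe = Point(-60, 330).buffer(10)
        overlapping = LocationLog.objects.filter(utm__intersects=probe)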

    Read the article

  • How to insert an integer value from a grid into a SQL table?

    - by z-chaos
    Hi, I have an AdvWebGrid where the 7th column is a DynEdit into which the user enters a value. Now I have to take the entered value and insert it into the SQL table. For example, if I have 7 records in the grid and the user enters comments for the first three records and saves, I then have to insert/update the first three comments in the table.

    Read the article
