Search Results

Search found 8916 results on 357 pages for 'bulk insert'.

Page 17 of 357

  • Last element not getting inserted in Tree

    - by rdk1992
    So I was asked to make a binary tree in Haskell, taking a list of Integers as input. Below is my code. My problem is that the last element of the list is not getting inserted into the tree. For example, with [1,2,3,4] it only inserts up to "3"; 4 never makes it into the tree.

      data ArbolBinario a = Node a (ArbolBinario a) (ArbolBinario a) | EmptyNode deriving (Show)

      insert(x) EmptyNode = insert(tail x) (Node (head x) EmptyNode EmptyNode)
      insert(x) (Node e izq der)
          | x == []     = EmptyNode   -- I added this line to fix the Prelude.head empty-list error; after I added it the last element started to be ignored and not inserted in the tree
          | head x == e = (Node e izq der)
          | head x < e  = (Node e (insert x izq) der)
          | head x > e  = (Node e izq (insert x der))

    Any ideas on what's going on here? Help is much appreciated.

    Read the article

  • Can't INSERT INTO SELECT into a table with identity column

    - by Eran Goldin
    In SQL Server, I'm using a table variable and, when done manipulating it, I want to insert its values into a real table that has an identity column which is also the PK. The table variable I'm making has two columns; the physical table has four, the first of which is the identity column, an integer PK. The data types of the columns I want to insert match the target columns' data types.

      INSERT INTO [dbo].[Message] ([Name], [Type])
      SELECT DISTINCT [Code], [MessageType]
      FROM @TempTableVariable
      END

    This fails with: Cannot insert duplicate key row in object 'dbo.Message' with unique index 'IX_Message_Id'. The duplicate key value is (ApplicationSelection). But when I try to insert just VALUES (...) it works OK. How do I get it right?
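    A possible direction, sketched here rather than taken from the post: the duplicate value (ApplicationSelection) looks like a name rather than an id, so the unique index IX_Message_Id may in fact cover [Name] despite its own name. Assuming that, filtering out names that already exist in the target avoids the collision:

      -- Sketch only; assumes IX_Message_Id enforces uniqueness on [Name].
      INSERT INTO [dbo].[Message] ([Name], [Type])
      SELECT DISTINCT t.[Code], t.[MessageType]
      FROM @TempTableVariable AS t
      WHERE NOT EXISTS (SELECT 1 FROM [dbo].[Message] AS m WHERE m.[Name] = t.[Code]);

    If the same [Code] can appear with more than one [MessageType] inside the table variable, DISTINCT alone still yields two rows for that code, so a GROUP BY t.[Code] (picking one [MessageType], e.g. MIN) would be needed as well.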

    Read the article

  • Salesforce/PHP - Bulk Outbound message (SOAP), Time out issue - See update #2

    - by Phill Pafford
    Salesforce can send up to 100 requests inside 1 SOAP message. While sending this type of bulk Outbound message request, my PHP script finishes executing but SF fails to accept the ACK used to clear the message queue on the Salesforce side of things. Looking at the Outbound message log (monitoring) I see all the messages in a pending state with the delivery failure reason "java.net.SocketTimeoutException: Read timed out". If my script has finished execution, why do I get this error? I have tried these methods to increase the execution time on my server, as I have no access on the Salesforce side:

      set_time_limit(0); // in the script
      max_execution_time = 360 ; Maximum execution time of each script, in seconds
      max_input_time = 360 ; Maximum amount of time each script may spend parsing request data
      memory_limit = 32M ; Maximum amount of memory a script may consume

    I used the high settings just for testing. Any thoughts as to why this is failing the ACK delivery back to Salesforce? Here is some of the code. This is how I accept and send the ACK file for the incoming SOAP request:

      $data = 'php://input';
      $content = file_get_contents($data);
      if($content) {
          respond('true');
      } else {
          respond('false');
      }

    The respond function:

      function respond($tf) {
          $ACK = <<<ACK
      <?xml version = "1.0" encoding = "utf-8"?>
      <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
          <soapenv:Body>
              <notifications xmlns="http://soap.sforce.com/2005/09/outbound">
                  <Ack>$tf</Ack>
              </notifications>
          </soapenv:Body>
      </soapenv:Envelope>
      ACK;
          print trim($ACK);
      }

    These are in a generic script that I include into the script that uses the data for a specific workflow. I can process about 25 requests (that are in 1 SOAP response), but once I go over that I get the timeout error in the Salesforce queue. 50 requests usually take my PHP script 86.77 seconds. Could it be Apache? PHP? I have also tested just accepting the 100-request SOAP response and sending the ACK, and the queue clears out, so I know it's on my side of things. I show no errors in the Apache log; the script runs fine. I did find some info on the Salesforce site but still no luck. Here is the link. Also I'm using the PHP Toolkit 11 (from Salesforce). Other forum with good SF help. Thanks for any insight into this, --Phill

    UPDATE: If I receive the incoming message and print the response, should this happen first regardless of whether I do anything else after? Or does it wait for my process to finish and then print the response?

    UPDATE #2: Okay, I think I have the problem: PHP uses the single-thread processing approach and will not send back the ACK file until the thread has completed its processing. Is there a way to make this a multi-thread process?

      Thread #1 - accept the incoming SOAP request and send back the ACK
      Thread #2 - process the SOAP request

    I know I could break it up into something like a DB table or flat file, but is there a way to accomplish this without doing that? I'm going to try to close the socket after the ACK submission and continue the processing; fingers crossed it will work.

    Read the article

  • PostgreSQL insert on primary key failing with contention, even at serializable level

    - by Steven Schlansker
    I'm trying to insert or update data in a PostgreSQL db. The simplest case is a key-value pairing (the actual data is more complicated, but this is the smallest clear example). When you set a value, I'd like it to insert if the key is not there, otherwise update. Sadly Postgres does not have an insert-or-update statement, so I have to emulate it myself. I've been working with the idea of basically SELECTing whether the key exists, and then running the appropriate INSERT or UPDATE. Now clearly this needs to be in a transaction or all manner of bad things could happen. However, this is not working exactly how I'd like it to - I understand that there are limitations to serializable transactions, but I'm not sure how to work around this one. Here's the situation (sessions a and b run concurrently):

      ab: => set transaction isolation level serializable;
      a:  => select count(1) from table where id=1;  --> 0
      b:  => select count(1) from table where id=1;  --> 0
      a:  => insert into table values(1);            --> 1
      b:  => insert into table values(1);            --> ERROR: duplicate key value violates unique constraint "serial_test_pkey"

    Now I would expect it to throw the usual "couldn't commit due to concurrent update", but I'm guessing that since the inserts are different "rows" this does not happen. Is there an easy way to work around this?
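    A sketch of the usual workaround, not taken from the original post: let the INSERT race happen and catch the unique-key error, then retry the UPDATE. This is essentially the merge_db pattern from the PostgreSQL documentation; the table kv(id, value) below is a placeholder for the real schema. In PostgreSQL 9.5 and later, INSERT ... ON CONFLICT DO UPDATE does the same thing natively.

      -- Placeholder table: kv(id integer primary key, value text)
      CREATE OR REPLACE FUNCTION upsert_kv(k integer, v text) RETURNS void AS $$
      BEGIN
          LOOP
              UPDATE kv SET value = v WHERE id = k;
              IF FOUND THEN
                  RETURN;
              END IF;
              BEGIN
                  INSERT INTO kv (id, value) VALUES (k, v);
                  RETURN;
              EXCEPTION WHEN unique_violation THEN
                  -- another session inserted the same key first; loop and update instead
              END;
          END LOOP;
      END;
      $$ LANGUAGE plpgsql;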

    Read the article

  • Using OUTPUT/INTO within instead of insert trigger invalidates 'inserted' table

    - by Dan
    I have a problem using a table with an INSTEAD OF INSERT trigger. The table I created contains an identity column, and I need an INSTEAD OF INSERT trigger on it. I also need to see the value of the newly inserted identity from within my trigger, which requires the use of OUTPUT/INTO inside the trigger. The problem is then that clients performing INSERTs cannot see the inserted values. For example, I create a simple table:

      CREATE TABLE [MyTable](
          [MyID] [int] IDENTITY(1,1) NOT NULL,
          [MyBit] [bit] NOT NULL,
          CONSTRAINT [PK_MyTable_MyID] PRIMARY KEY NONCLUSTERED ([MyID] ASC))

    Next I create a simple INSTEAD OF trigger:

      create trigger [trMyTableInsert] on [MyTable] instead of insert as
      BEGIN
          DECLARE @InsertedRows table(MyID int, MyBit bit);

          INSERT INTO [MyTable] ([MyBit])
          OUTPUT inserted.MyID, inserted.MyBit INTO @InsertedRows
          SELECT inserted.MyBit FROM inserted;

          -- LOGIC NOT SHOWN HERE THAT USES @InsertedRows
      END;

    Lastly, I attempt to perform an insert and retrieve the inserted values:

      DECLARE @tbl TABLE (myID INT)
      insert into MyTable (MyBit)
      OUTPUT inserted.MyID INTO @tbl
      VALUES (1)
      SELECT * from @tbl

    The issue is that all I ever get back is zero. I can see the row was correctly inserted into the table. I also know that if I remove the OUTPUT/INTO from within the trigger this problem goes away. Any thoughts as to what I'm doing wrong? Or is what I want to do not feasible? Thanks.
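    One hedged workaround, sketched rather than guaranteed: the caller's OUTPUT clause only sees the trigger's virtual inserted rows, in which the identity value has not been generated yet, which is consistent with always getting zero back. The trigger itself can instead hand the real key values back to the caller as a result set:

      -- Sketch: same trigger as above, plus a final SELECT for the caller.
      ALTER TRIGGER [trMyTableInsert] ON [MyTable]
      INSTEAD OF INSERT AS
      BEGIN
          DECLARE @InsertedRows table(MyID int, MyBit bit);

          INSERT INTO [MyTable] ([MyBit])
          OUTPUT inserted.MyID, inserted.MyBit INTO @InsertedRows
          SELECT inserted.MyBit FROM inserted;

          -- logic that uses @InsertedRows goes here

          SELECT MyID, MyBit FROM @InsertedRows;  -- caller reads this result set
      END;

    The client then reads that result set instead of relying on its own OUTPUT ... INTO; whether that is acceptable depends on the calling code.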

    Read the article

  • SQL Server INSERT ... SELECT Statement won't parse

    - by Jim Barnett
    I am getting the following error message with SQL Server 2005 Msg 120, Level 15, State 1, Procedure usp_AttributeActivitiesForDateRange, Line 18 The select list for the INSERT statement contains fewer items than the insert list. The number of SELECT values must match the number of INSERT columns. I have copy and pasted the select list and insert list into excel and verified there are the same number of items in each list. Both tables an additional primary key field with is not listed in either the insert statement or select list. I am not sure if that is relevant, but suspicious it may be. Here is the source for my stored procedure: CREATE PROCEDURE [dbo].[usp_AttributeActivitiesForDateRange] ( @dtmFrom DATETIME, @dtmTo DATETIME ) AS BEGIN SET NOCOUNT ON; DECLARE @dtmToWithTime DATETIME SET @dtmToWithTime = DATEADD(hh, 23, DATEADD(mi, 59, DATEADD(s, 59, @dtmTo))); -- Get uncontested DC activities INSERT INTO AttributedDoubleClickActivities ([Time], [User-ID], [IP], [Advertiser-ID], [Buy-ID], [Ad-ID], [Ad-Jumpto], [Creative-ID], [Creative-Version], [Creative-Size-ID], [Site-ID], [Page-ID], [Country-ID], [State Province], [Areacode], [OS-ID], [Domain-ID], [Keyword], [Local-User-ID], [Activity-Type], [Activity-Sub-Type], [Quantity], [Revenue], [Transaction-ID], [Other-Data], Ordinal, [Click-Time], [Event-ID]) SELECT [Time], [User-ID], [IP], [Advertiser-ID], [Buy-ID], [Ad-ID], [Ad-Jumpto], [Creative-ID], [Creative-Version], [Creative-Size-ID], [Site-ID], [Page-ID], [Country-ID], [State Province], [Areacode], [OS-ID], [Domain-ID], [Keyword], [Local-User-ID] [Activity-Type], [Activity-Sub-Type], [Quantity], [Revenue], [Transaction-ID], [Other-Data], REPLACE(Ordinal, '?', '') AS Ordinal, [Click-Time], [Event-ID] FROM Activity_Reports WHERE [Time] BETWEEN @dtmFrom AND @dtmTo AND REPLACE(Ordinal, '?', '') IN (SELECT REPLACE(Ordinal, '?', '') FROM Activity_Reports WHERE [Time] BETWEEN @dtmFrom AND @dtmTo EXCEPT SELECT CONVERT(VARCHAR, TripID) FROM VisualSciencesActivities WHERE [Time] BETWEEN @dtmFrom AND @dtmTo); END GO
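    Msg 120 is purely a count mismatch between the two lists. One classic way to come up one item short without noticing, shown here as a hypothetical two-column example rather than a diagnosis of the procedure above, is a missing comma: T-SQL then parses the second bracketed name as a column alias, so the SELECT list silently loses an entry.

      -- Hypothetical repro: Target has two columns, the SELECT supplies one.
      INSERT INTO dbo.Target (ColA, ColB)
      SELECT [ColA] [ColB]    -- parsed as "ColA AS ColB": one item, not two
      FROM dbo.Source;

    With that rule in mind, it may be worth re-counting the SELECT list above around [Local-User-ID] [Activity-Type], where no comma separates the two names.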

    Read the article

  • Not able to insert data in the database from a form in PHP

    - by Prashant Baid
    I am not able to insert data into my database, and I don't know what the problem is. Here is the code:

      mysql_select_db("mitestore", $con); */
      if ((isset($_POST['product_name'])) && (strlen(trim($_POST['product_name'])) > 0)) {
          $product_name = stripslashes(strip_tags($_POST['product_name']));
          $sql="INSERT INTO sell (product_name) VALUE ('$_POST[product_name]')";
      } else {$product_name = 'Please enter the product name.';}
      if ((isset($_POST[''])) && (strlen(trim($_POST['how_old'])) > 0)) {
          $how_old = stripslashes(strip_tags($_POST['how_old']));
          $sql="INSERT INTO sell (how_old) VALUE ('$_POST[how_old]')";
      } else {$how_old = 'Please enter how old your product is';}
      if ((isset($_POST['which_block'])) && (strlen(trim($_POST['which_block'])) > 0)) {
          $which_block = stripslashes(strip_tags($_POST['which_block']));
          $sql="INSERT INTO sell (which_block) VALUE ('$_POST[which_block]')";
      } else {$which_block = 'Please enter which block are you from';}
      if ((isset($_POST['room_no'])) && (strlen(trim($_POST['room_no'])) > 0)) {
          $room_no = stripslashes(strip_tags($_POST['room_no']));
          $sql="INSERT INTO sell (room_no) VALUE ('$_POST[room_no]')";
      } else {$room_no = 'Please enter the room no:';}
      if (!mysql_query($sql,$con)) {
          die('Error: ' . mysql_error());
      }
      echo "Success!";
      mysql_close($con)
      ?>

    Initially I had this code and it worked for me:

      mysql_select_db("database", $con);
      $sql="INSERT INTO sell ( product_name, how_old , selling_price, negotiable, which_block, room_no)
      VALUES ('$_POST[product_name]','$_POST[how_old]','$_POST[selling_price]','$_POST[negotiable]','$_POST[which_block]','$_POST[room_no]')";
      if (!mysql_query($sql,$con)) {
          die('Error: ' . mysql_error());
      }
      echo "Your product is added.";
      mysql_close($con)
      ?>

    But I don't know how to validate each field individually.

    Read the article

  • F# insert/remove item from list

    - by Timothy
    How should I go about removing a given element from a list? As an example, say I have list ['A'; 'B'; 'C'; 'D'; 'E'] and want to remove the element at index 2 to produce the list ['A'; 'B'; 'D'; 'E']? I've already written the following code which accomplishes the task, but it seems rather inefficient to traverse the start of the list when I already know the index. let remove lst i = let rec remove lst lst' = match lst with | [] -> lst' | h::t -> if List.length lst = i then lst' @ t else remove t (lst' @ [h]) remove lst [] let myList = ['A'; 'B'; 'C'; 'D'; 'E'] let newList = remove myList 2 Alternatively, how should I insert an element at a given position? My code is similar to the above approach and most likely inefficient as well. let insert lst i x = let rec insert lst lst' = match lst with | [] -> lst' | h::t -> if List.length lst = i then lst' @ [x] @ lst else insert t (lst' @ [h]) insert lst [] let myList = ['A'; 'B'; 'D'; 'E'] let newList = insert myList 2 'C'

    Read the article

  • insert newline in perl -e statement

    - by lydonchandra
    Hi. If I do this in bash:

      perl -e '$x; $y'

    how can I insert a new line between the character ; and $y? I don't want to re-type the whole line; I just want to move my cursor to the position and then insert a newline. Is this possible? Many thanks.

    Read the article

  • Data take on with Drupal 6

    - by Robert MacLean
    We are migrating our current intranet to Drupal 6, and there is a lot of data within the current system which can be classified into: list data (general lists of fields; a common use is a phone list of the employees' phone numbers) and a document repository (basically a web version of a file share for documents). I can easily get the data plus meta information out, but how do I bulk upload the two types of data into Drupal? Uploading the hundreds of thousands of items manually is just not acceptable.

    Read the article

  • Bulk inserting: best way to go about it? + Helping me understand fully what I found so far

    - by chobo2
    Hi So I saw this post here and read it and it seems like bulk copy might be the way to go. http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials. http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx First way uses 2 ado.net 2.0 features. BulkInsert and BulkCopy. the second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However as one person pointed out in the posts what he just going around the issue at the cost of performance( nothing wrong with that in my opinion) First I will talk about the 2 ways in the first tutorial I am using VS2010 Express, .net 4.0, MVC 2.0, SQl Server 2005 Is ado.net 2.0 the most current version? Based on the technology I am using, is there some updates to what I am going to show that would improve it somehow? Is there any thing that these tutorial left out that I should know about? BulkInsert I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) ) SP Code USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END C# Code /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } So first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? Like I am sending 500,000 records so I did a Batch size of 500,000. Next why does it crash when I do this? If I set it to 1000 for batch size it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. 
(provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException: Time it took to insert 500,000 records with insert batch size of 1000 took "2 mins and 54 seconds" Of course this is no official time I sat there with a stop watch( I am sure there are better ways but was too lazy to look what they where) So I find that kinda slow compared to all my other ones(expect the linq to sql insert one) and I am not really sure why. Next I looked at bulkcopy /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } This one seemed to go really fast and did not even need a SP( can you use SP with bulk copy? If you can would it be better?) BatchCopy had no problem with a 500,000 batch size.So again why make it smaller then the number of records you want to send? I found that with BatchCopy and 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster then the bulkinsert one above. Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc C# code. /// <summary> /// This is using linq to sql to make the table objects. 
/// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } So I like this because I get to use objects even though it is kinda redundant. I don't get how the SP works. Like I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood but I do not even know how to take this example SP and change it to fit my tables since like I said I don't know what is going on. I also don't know what would happen if the object you have more tables in it. Like say I have a ProductName table what has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good for 500,000 records it took 52 seconds The last way of course was just using linq to do it all and it was pretty bad. /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records and that took over a minute to do. So I really narrowed it done to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationship for either way. I am not sure how they both stand up when doing updates instead of inserts as I have not gotten around to try it yet. I don't think I will ever need to insert/update more than 50,000 records at one type but at the same time I know I will have to do validation on records before inserting so that will slow it down and that sort of makes linq to sql nicer as your got objects especially if your first parsing data from a xml file before you insert into the database. Full C# code using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. 
/// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. 
adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }
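    On the "I don't get how the SP works" point: the stored procedure shreds the XML string into a rowset (sp_xml_preparedocument plus OPENXML) and then does one set-based INSERT ... SELECT from it, which is why it behaves like a bulk insert. Below is a hedged sketch of the same idea using the xml data type and nodes(), which is also available on SQL Server 2005 and avoids the prepare/remove document calls; the element names are an assumption based on the serializer output implied in the post:

      -- Sketch only: assumes the XML looks like
      -- <ArrayOfTBL_TEST_TEST><TBL_TEST_TEST><NAME>...</NAME></TBL_TEST_TEST>...
      CREATE PROCEDURE dbo.spTEST_InsertXmlNodes (@UpdatedProdData xml)
      AS
      BEGIN
          INSERT INTO dbo.TBL_TEST_TEST (NAME)
          SELECT x.item.value('(NAME)[1]', 'varchar(50)')
          FROM @UpdatedProdData.nodes('/ArrayOfTBL_TEST_TEST/TBL_TEST_TEST') AS x(item);
      END

    For related tables (the Product/ProductName example), one option is to apply the same pattern per table - shred the XML once into each rowset and insert parent rows before child rows - or to run one SqlBulkCopy per table in dependency order.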

    Read the article

  • meteor mongodb _id changing after insert (and UUID property as well)

    - by lommaj
    I have meteor method that does an insert. Im using Regulate.js for form validation. I set the game_id field to Meteor.uuid() to create a unique value that I also route to /game_show/:game_id using iron router. As you can see I'm logging the details of the game, this works fine. (image link to log below) Meteor.methods({ create_game_form : function(data){ Regulate.create_game_form.validate(data, function (error, data) { if (error) { console.log('Server side validation failed.'); } else { console.log('Server side validation passed!'); // Save data to database or whatever... //console.log(data[0].value); var new_game = { game_id: Meteor.uuid(), name : data[0].value, game_type: data[1].value, creator_user_id: Meteor.userId(), user_name: Meteor.user().profile.name, created: new Date() }; console.log("NEW GAME BEFORE INSERT: ", new_game); GamesData.insert(new_game, function(error, new_id){ console.log("GAMES NEW MONGO ID: ", new_id) var game_data = GamesData.findOne({_id: new_id}); console.log('NEW GAME AFTER INSERT: ', game_data); Session.set('CURRENT_GAME', game_data); }); } }); } }); All of the data coming out of the console.log at this point works fine After this method call the client routes to /game_show/:game_id Meteor.call('create_game_form', data, function(error){ if(error){ return alert(error.reason); } //console.log("post insert data for routing variable " ,data); var created_game = Session.get('CURRENT_GAME'); console.log("Session Game ", created_game); Router.go('game_show', {game_id: created_game.game_id}); }); On this view, I try to load the document with the game_id I just inserted Template.game_start.helpers({ game_info: function(){ console.log(this.game_id); var game_data = GamesData.find({game_id: this.game_id}); console.log("trying to load via UUID ", game_data); return game_data; } }); sorry cant upload images... :-( https://www.evernote.com/shard/s21/sh/c07e8047-de93-4d08-9dc7-dae51668bdec/a8baf89a09e55f8902549e79f136fd45 As you can see from the image of the console log below, everything matches the id logged before insert the id logged in the insert callback using findOne() the id passed in the url However the mongo ID and the UUID I inserted ARE NOT THERE, the only document in there has all the other fields matching except those two! Not sure what im doing wrong. Thanks!

    Read the article

  • CommandName issues when inserting into a database

    - by Alexander
    In the below code, when we define the parameters CommandName="Insert" is it actually the same as executing the method Insert? As I can't find Insert anywhere... <div class="actionbuttons"> <Club:RolloverButton ID="apply1" CommandName="Insert" Text="Add Event" runat="server" /> <Club:RolloverLink ID="Cancel" Text="Cancel" runat="server" NavigateURL='<%# "Events_view.aspx?EventID=" + Convert.ToString(Eval("ID")) %>' /> </div> I have the following SqlDataSource as well: <asp:SqlDataSource ID="SqlDataSource1" runat="server" ConnectionString="<%$ ConnectionStrings:ClubSiteDB %>" SelectCommand="SELECT dbo.Events.id, dbo.Events.starttime, dbo.events.endtime, dbo.Events.title, dbo.Events.description, dbo.Events.staticURL, dbo.Events.photo, dbo.Events.location, dbo.Locations.title AS locationname FROM dbo.Events LEFT OUTER JOIN dbo.Locations ON dbo.Events.location = dbo.Locations.id where Events.id=@id" InsertCommand="INSERT INTO Events(starttime, endtime, title, description, staticURL, location, photo) VALUES (@starttime, @endtime, @title, @description, @staticURL, @location, @photo)" UpdateCommand="UPDATE Events SET starttime = @starttime, endtime=@endtime, title = @title, description = @description, staticURL = @staticURL, location = @location, photo = @photo WHERE (id = @id)" DeleteCommand="DELETE Events WHERE id=@id" OldValuesParameterFormatString="{0}"> <SelectParameters> <asp:QueryStringParameter Name="id" QueryStringField="ID" /> </SelectParameters> <UpdateParameters> <asp:Parameter Name="starttime" Type="DateTime" /> <asp:Parameter Name="endtime" Type="DateTime" /> <asp:Parameter Name="title" /> <asp:Parameter Name="description" /> <asp:Parameter Name="staticURL" /> <asp:Parameter Name="location" /> <asp:Parameter Name="photo" /> <asp:Parameter Name="id" /> </UpdateParameters> <InsertParameters> <asp:Parameter Name="starttime" Type="DateTime" /> <asp:Parameter Name="endtime" Type="DateTime" /> <asp:Parameter Name="title" /> <asp:Parameter Name="description" /> <asp:Parameter Name="staticURL" /> <asp:Parameter Name="location" /> <asp:Parameter Name="photo" /> <asp:Parameter Name="id" /> </InsertParameters> <DeleteParameters> <asp:QueryStringParameter Name="id" QueryStringField="ID" /> </DeleteParameters> </asp:SqlDataSource> I want it to insert using the InsertCommand, however when I do SqlDataSource1.Insert() it's complaining that starttime is NULL

    Read the article

  • Bulk update & occasional insert (coredata) - Too slow

    - by Andrew
    Update: Currently looking into NSSET's minusSet links: http://stackoverflow.com/questions/1475636/comparing-two-arrays Hi guys, Could benefit from your wisdom here.. I'm using Coredata in my app, on first launch I download a data file and insert over 500 objects (each with 60 attributes) - fast, no problem. Each subsequent launch I download an updated version of the file, from which I need to update all existing objects' attributes (except maybe 5 attributes) and create new ones for items which have been added to the downloaded file. So, first launch I get 500 objects.. say a week later my file now contains 507 items.. I create two arrays, one for existing and one for downloaded. NSArray *peopleArrayDownloaded = [CoreDataHelper getObjectsFromContext:@"person" :@"person_id" :YES :managedObjectContextPeopleTemp]; NSArray *peopleArrayExisting = [CoreDataHelper getObjectsFromContext:@"person" :@"person_id" :YES :managedObjectContextPeople]; If the count of each array is equal then I just do this: NSUInteger index = 0; if ([peopleArrayExisting count] == [peopleArrayDownloaded count]) { NSLog(@"Number of people downloaded is same as the number of people existing"); for (person *existingPerson in peopleArrayExisting) { person *tempPerson = [peopleArrayDownloaded objectAtIndex:index]; // NSLog(@"Updating id: %@ with id: %@",existingPerson.person_id,tempPerson.person_id); // I have 60 attributes which I to update on each object, is there a quicker way other than overwriting existing? index++; } } else { NSLog(@"Number of people downloaded is different to number of players existing"); So now comes the slow part. I end up using this (which is tooooo slow): NSLog(@"Need people added to the league"); for (person *tempPerson in peopeArrayDownloaded) { NSPredicate *predicate = [NSPredicate predicateWithFormat:@"person_id = %@",tempPerson.person_id]; // NSLog(@"Searching for existing person, person_id: %@",existingPerson.person_id); NSArray *filteredArray = [peopleArrayExisting filteredArrayUsingPredicate:predicate]; if ([filteredArray count] == 0) { NSLog(@"Couldn't find an existing person in the downloaded file. Adding.."); person *newPerson = [NSEntityDescription insertNewObjectForEntityForName:@"person" inManagedObjectContext:managedObjectContextPeople]; Is there a way to generate a new array of index items referring to the additional items in my downloaded file? Incidentally, on my tableViews I'm using NSFetchedResultsController so updating attributes will call [cell setNeedsDisplay]; .. about 60 times per cell, not a good thing and it can crash the app. Thanks for reading :)

    Read the article

  • [Word 2007] How to show "only number" in picture cross-reference

    - by kornelijepetak
    I have many pictures in a document and I reference them very often in text. I don't want to lose the order so I am using Insert - Cross-reference. This opens the cross-reference dialog where I can set Reference type to Picture. For "Insert reference to", there are 5 choices: - Entire caption - List item - Only label and number - Only caption text - Page number, Above/below What I need is a reference that would be inserted like this: [4], and not like this: [Picture 4]; None of these options enable me to do it. Is there any way to make Word 2007 insert a reference to only Caption Number? Note: The document is written in Croatian language which has 7 declension cases, so using "Picture 4" would not be valid in all cases. Actually caption label Picture is set to croatian word "Slika" and when I need to say say "in the picture" I can't because it would be "na Slici 5." and not "na Slika 5." (like Word would make me do). That's why I need to reference only the caption number. Is that possible in Word 2007?

    Read the article

  • How to show "only number" in picture cross-reference in Word 2007 document?

    - by kornelijepetak
    I have many pictures in a document and I reference them very often in text. I don't want to lose the order so I am using Insert - Cross-reference. This opens the cross-reference dialog where I can set Reference type to Picture. For "Insert reference to", there are 5 choices: - Entire caption - Only label and number - Only caption text - Page number - Above/below What I need is a reference that would be inserted like this: [4], and not like this: [Picture 4]; None of these options enable me to do it. Is there any way to make Word 2007 insert a reference to only Caption Number? Note: The document is written in Croatian language which has 7 declension cases, so using "Picture 4" would not be valid in all cases. Actually caption label Picture is set to croatian word "Slika" and when I need to say say "in the picture" I can't because it would be "na Slici 5." and not "na Slika 5." (like Word would make me do). That's why I need to reference only the caption number. Is that possible in Word 2007?

    Read the article

  • Prevent Outlook 2010 Insert Picture resizing image

    - by Rup
    When I "Insert Picture" a JPEG in Outlook 2010 it automatically resizes the image and, I think, recompresses it too. I realise this would be useful for photographs or for people who try to email 1MB BMPs but I would like to email around an image at the original pixel size without recompression. Is there a way to turn this off, or better still choose settings for each image insert? I found this page in the Office help. It's for Word, PowerPoint and Excel not Outlook but points you at File, Options, Advanced, Image Settings. There's no equivalent section in Outlook. I know Outlook uses Word as its editor so I've looked at Word's settings but there isn't an 'original size' here: there's only 'turn off image recompression' and pick target DPI from 96, 150, 220. I guess Office is finding a DPI value in the JPEG file and scaling it up or down to match this setting. I can't find an equivalent option in Outlook's options menu but there's so many settings and pop-up dialogs I may have missed something. Picture Format, Reset image size resets the image to the rescaled version, not the original. I can't see a way to edit a pixel value into size values in the image properties after insert. Thanks! I realise I can probably achieve this by editing the image metadata in PhotoShop elements or similar but there ought to be a way without editing the file? This is new behaviour in Outlook 2010; 2007 didn't do this.

    Read the article

  • postfix received and delivered have the same values (?)

    - by thinkingbig
    I have configured my first server (Debian with ISPConfig). Generally i want to send bulk e-mails to our users, i configure postfix and turn on postfix... but... After 1 hour of sending emails i have logs like this: Grand Totals messages 21886 received 21883 delivered 0 forwarded 0 deferred 234 bounced 0 rejected (0%) 0 reject warnings 0 held 0 discarded (0%) 30805k bytes received 31280k bytes delivered 3 senders 3 sending hosts/domains 12588 recipients 3 recipient hosts/domains Per-Hour Traffic Summary time received delivered deferred bounced rejected -------------------------------------------------------------------- 0000-0100 0 0 0 0 0 0100-0200 0 0 0 0 0 0200-0300 0 0 0 0 0 0300-0400 0 0 0 0 0 0400-0500 0 0 0 0 0 0500-0600 0 0 0 0 0 0600-0700 0 0 0 0 0 0700-0800 0 0 0 0 0 0800-0900 0 0 0 0 0 0900-1000 0 0 0 0 0 1000-1100 0 0 0 0 0 1100-1200 0 0 0 0 0 1200-1300 0 0 0 0 0 1300-1400 0 0 0 0 0 1400-1500 0 0 0 0 0 1500-1600 15311 15306 0 168 0 1600-1700 6575 6577 0 66 0 1700-1800 0 0 0 0 0 1800-1900 0 0 0 0 0 1900-2000 0 0 0 0 0 2000-2100 0 0 0 0 0 2100-2200 0 0 0 0 0 2200-2300 0 0 0 0 0 2300-2400 0 0 0 0 0 Host/Domain Summary: Message Delivery sent cnt bytes defers avg dly max dly host/domain 21521 30353k 0 3.4 m 15.5 m wp.pl 355 919k 0 54.9 s 13.0 m mysenderdomainexample.pl 7 8477 0 1.7 s 1.9 s prokonto.pl Host/Domain Summary: Messages Received msg cnt bytes host/domain 21879 30786k mysenderdomainexample.pl 5 16196 mx4.wp.pl 1 3200 mx3.wp.pl Senders by message count 21783 [email protected] 96 [email protected] 6 from=< **So, my question is: 1) Why i have recived and delivered have the same values (approx)? 2) How can I check if an email has been delivered? 3) How to change default "root" and "www-data" user (FROM / RETURN PATH) to another? I have changed this in script, but postfix ignore scripting values and send every mail from root (we have .php send cron's in /etc/crontab) 4) WHY APPROX 100 % MAILS RECIVED HAS BEEN ADRESED TO MY SENDER HOST? Host/Domain Summary: Messages Received Waiting for respond, Regards TB**

    Read the article

  • The best solution for sending emails to group members in a social network

    - by praveen
    I'm developing a social network in PHP. There is a feature for members to create custom groups, and I want to enable the owner of a group to send emails in bulk to all group members. Most importantly, I want to make sure that most mails are not going to end up in users' spam folders. I'm using PHPMailer on my website. What would be the best and most affordable approach? These are the solutions I came up with; please add your own if possible:

      1. Using my server's (hostgator.com) SMTP server to send emails in bulk
      2. Using Google Apps SMTP (I have even set up SPF for this)
      3. Using a third party to handle my outgoing messages (e.g. MailChimp, iContact - do they provide APIs? this is just my idea)

    Thanks in advance.

    Read the article

  • DB2 insert performance - How to measure

    - by svrist
    [From Stack Overflow] I'm trying to find a way to speed up my inserts to DB2 9.7.1 (Ubuntu Linux). I'm watching vmstat and trying to gather some statistics via the db2 get snapshot commands, but I'm not able to figure out which numbers I'm looking for to be able to see where the trouble is. I've read lots of stuff like http://www.eggheadcafe.com/software/aspnet/35692526/question-multiple-row-in.aspx and http://www.ibm.com/developerworks/data/library/tips/dm-0403wilkins/, and tricks like ALTER TABLE lalala APPEND ON work somewhat (the difference between a dd if=/dev/zero and an insert is still a factor of 10), but I would like to be able to find the counters or other performance indicators that actually show why it makes sense to use those tricks. For example: What is the metric called that shows me that it is buffer page allocation (the FSCR stuff) that is the problem? Where do I see that the insert time is hampered by clustered indexes? I find db2top very useful, but I'm still searching for a more direct view of "this is your bottleneck" methods.
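    Not from the original post, but as a baseline to compare the counters against: the cheap levers on the insert path itself are append mode (skips the free-space/FSCR search on every insert), multi-row VALUES, and committing in batches rather than per row. A hedged sketch, with placeholder column names:

      -- Append at the end of the table instead of searching FSCRs for free space.
      ALTER TABLE lalala APPEND ON;

      -- Multi-row inserts and batched commits cut per-statement and per-commit overhead.
      INSERT INTO lalala (col1, col2) VALUES
          (1, 'a'),
          (2, 'b'),
          (3, 'c');
      COMMIT;

    On the measurement side, the table snapshot (db2 get snapshot for tables on <dbname>) reports rows written per table, which at least shows where the insert volume is landing; whether a specific FSCR-search counter is exposed there is version-dependent, so treat that as something to verify.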

    Read the article

  • Algorithm to price bulk discounts

    - by sam munkes
    Hi, I am designing a Chinese auction website. Tickets ($5, $10 & $20) are sold either individually or via packages that carry discounts. There are various ticket packages, for example:

      1. 5 x $5 tickets = receive 10% off
      2. 5 x $10 tickets = receive 10% off
      3. 5 x $20 tickets = receive 10% off
      4. 5 x $5 tickets + 5 x $10 tickets + 5 x $20 tickets = receive 15% off

    When users add tickets to their cart, I need to figure out the cheapest package(s) to give them. The trick is that if a user adds 4 x $5 tickets + 5 x $10 tickets + 5 x $20 tickets, it should still give him package #4, since that would be the cheapest for him. Any help in figuring out an algorithm to solve this, or any tips, would be greatly appreciated. Thanks.

    Read the article

  • WiX 3: Using heat.exe to add bulk files to a new WiX project: HEAT5150

    - by Karen Kwong
    If this is a repeat question, please direct me to the existing solution. I wasn't able to find a matching query. We currently use InstallShield. I'm attempting to covert a project with 407 files to a WiX3 installation package. I tried using heat.exe to do some of the automation but I get the following warning for almost every file: c: heat dir "c:\projectDir\projectA" -gg -ke -template:Product -out "c:\install\projectA\heatOutput" heat.exe: warning HEAT5150 : Could not harvest data from a file that was expected to be a SelfReg DLL: c:\projectDir\projectA\plugin1.dll. If this file does not support SelfReg you can ignore this warning. Otherwise, this error detail may be helpful to diagnose the failure: Unable to load file: c:\projectDir\projectA\plugin1.dll, error: 126. Q: Is it normal for this warning to be reported for every file? If there's a current "How To create/convert to your first WiX install project with many files" tutorial, please point me to it. The key requirement is "with many files". Thank-you -Karen Kwong- PS. I know that WiX is designed for incremental install project creation but it would be nice to know if there's an automated way to convert existing install projects.

    Read the article

  • Get non-overlapping date ranges for price history data

    - by Anonymouse
    Hello, Let's assume that I have the following table: CREATE TABLE [dbo].[PricesHist]( [Product] varchar NOT NULL, [Price] [float] NOT NULL, [StartDate] [datetime] NOT NULL, [EndDate] [datetime] NOT NULL ) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D2C00000000 AS DateTime), CAST(0x00009D2C00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D2D00000000 AS DateTime), CAST(0x00009D2D00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 2.5, CAST(0x00009D2E00000000 AS DateTime), CAST(0x00009D2E00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3000000000 AS DateTime), CAST(0x00009D3000000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3100000000 AS DateTime), CAST(0x00009D3100000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3400000000 AS DateTime), CAST(0x00009D3400000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 2.5, CAST(0x00009D3500000000 AS DateTime), CAST(0x00009D3500000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3600000000 AS DateTime), CAST(0x00009D3600000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3700000000 AS DateTime), CAST(0x00009D3700000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3800000000 AS DateTime), CAST(0x00009D3800000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3A00000000 AS DateTime), CAST(0x00009D3A00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3B00000000 AS DateTime), CAST(0x00009D3B00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 2.5, CAST(0x00009D3C00000000 AS DateTime), CAST(0x00009D3C00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3D00000000 AS DateTime), CAST(0x00009D3D00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3E00000000 AS DateTime), CAST(0x00009D3E00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D3F00000000 AS DateTime), CAST(0x00009D3F00000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D4100000000 AS DateTime), CAST(0x00009D4100000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D4200000000 AS DateTime), CAST(0x00009D4200000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 2.5, CAST(0x00009D4300000000 AS DateTime), CAST(0x00009D4300000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, 
CAST(0x00009D4400000000 AS DateTime), CAST(0x00009D4400000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D4500000000 AS DateTime), CAST(0x00009D4500000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D4600000000 AS DateTime), CAST(0x00009D4600000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 4.9, CAST(0x00009D4800000000 AS DateTime), CAST(0x00009D4800000000 AS DateTime)) INSERT [dbo].[PricesHist] ([Product], [Price], [StartDate], [EndDate]) VALUES (N'Apples', 2.5, CAST(0x00009D4A00000000 AS DateTime), CAST(0x00009D4A00000000 AS DateTime)) As you can see, there are two prices on that month for Apples. 4.90 and 2.50. In order to tidy this table up, I need to get this information as a date range rather than a row per day as it currently is. I can obviously do this with Min and Max aggregates easily but the ranges overlap and other business code expect non-overlapping ranges. I also tried to achieve this with self joins and row_number(), but without much success... Here is what I'm trying to achieve as the output: Product | StartDate | EndDate | Price ------------------------------------------- Apples | 01 Mar 2010 | 02 Mar 2010 | 4.90 Apples | 03 Mar 2010 | 03 Mar 2010 | 2.50 Apples | 05 Mar 2010 | 09 Mar 2010 | 4.90 Apples | 10 Mar 2010 | 10 Mar 2010 | 2.50 Apples | 11 Mar 2010 | 16 Mar 2010 | 4.90 Apples | 17 Mar 2010 | 17 Mar 2010 | 2.50 Apples | 18 Mar 2010 | 23 Mar 2010 | 4.90 Apples | 24 Mar 2010 | 24 Mar 2010 | 2.50 Apples | 25 Mar 2010 | 30 Mar 2010 | 4.90 Apples | 31 Mar 2010 | 31 Mar 2010 | 2.50 What would please be the best approach to get this done? Thanks a lot in advance,
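    A sketch of the usual "islands" approach for this (SQL Server 2005 and later), grouping consecutive rows that share the same price by the difference of two ROW_NUMBER sequences; the column names are taken from the table definition above:

      WITH ordered AS (
          SELECT Product, Price, StartDate, EndDate,
                 ROW_NUMBER() OVER (PARTITION BY Product ORDER BY StartDate)
               - ROW_NUMBER() OVER (PARTITION BY Product, Price ORDER BY StartDate) AS grp
          FROM dbo.PricesHist
      )
      SELECT Product,
             MIN(StartDate) AS StartDate,
             MAX(EndDate)   AS EndDate,
             Price
      FROM ordered
      GROUP BY Product, Price, grp
      ORDER BY Product, MIN(StartDate);

    Each run of identical consecutive prices collapses to one row, which matches the expected output listed above; because the ranges come from runs of existing rows, they cannot overlap.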

    Read the article

  • Bulk email sender and list management API

    - by goombaloon
    I'm looking for a way to programmatically manage email lists and send large numbers of emails to these lists. The general idea would be similar to how social networking sites send email notifications when a friend posts new content. Since a user could have tens of thousands of followers, it doesn't seem realistic to send these directly from my application so I'm curious if there's a good way to handle this through a service provider. My company is using Google apps but I'm not an expert on their API. Our environment is ASP.NET MVC. Any advice is most appreciated!

    Read the article
