Search Results

Search found 811 results on 33 pages for 'bulk'.


  • How to use SQLAlchemy to dump an SQL file from query expressions to bulk-insert into a DBMS?

    - by Mahmoud Abdelkader
    Please bear with me as I explain the problem and how I tried to solve it; my question on how to improve it is at the end.

    I have a 100,000-line CSV file from an offline batch job that I needed to insert into the database as its proper models. Ordinarily, if this were a fairly straightforward load, it could be done trivially by just munging the CSV file to fit a schema, but I had to do some external processing that requires querying, and it's just much more convenient to use SQLAlchemy to generate the data I want. The data I want here is 3 models that represent 3 pre-existing tables in the database, and each subsequent model depends on the previous model. For example:

        Model C --> Foreign Key --> Model B --> Foreign Key --> Model A

    So the models must be inserted in the order A, B, and C. I came up with a producer/consumer approach:

        - instantiate a multiprocessing.Process which contains a thread pool of 50 persister threads that have a thread-local connection to the database
        - read a line from the file using the csv DictReader
        - enqueue the dictionary to the process, where each thread creates the appropriate models by querying the right values and each thread persists the models in the appropriate order

    This was faster than a non-threaded read/persist, but it is way slower than bulk-loading a file into the database. The job finished persisting after about 45 minutes. For fun, I decided to write it in SQL statements; it took 5 minutes. Writing the SQL statements took me a couple of hours, though.

    So my question is: could I have used a faster method to insert rows using SQLAlchemy? As I understand it, SQLAlchemy is not designed for bulk insert operations, so this is less than ideal. That leads to my second question: is there a way to generate the SQL statements with SQLAlchemy, throw them in a file, and then just bulk-load that file into the database? I know about str(model_object), but it does not show the interpolated values. I would appreciate any guidance on how to do this faster. Thanks!
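
    A minimal sketch of one possible answer, assuming a newer SQLAlchemy release (0.9+, likely newer than the question): compiling a statement with literal_binds inlines the parameter values into the SQL text, which is exactly what str(model_object) does not do. The table and file names here are hypothetical stand-ins.

        from sqlalchemy import Table, Column, Integer, String, MetaData
        from sqlalchemy.dialects import postgresql

        metadata = MetaData()
        # Hypothetical table standing in for one of the three models' tables.
        model_a = Table('model_a', metadata,
                        Column('id', Integer, primary_key=True),
                        Column('name', String(50)))

        def dump_insert(row, out):
            stmt = model_a.insert().values(**row)
            # literal_binds renders the values into the statement text, so the
            # output can be fed straight to the database's bulk loader.
            out.write(str(stmt.compile(dialect=postgresql.dialect(),
                                       compile_kwargs={"literal_binds": True})) + ';\n')

        with open('bulk_load.sql', 'w') as f:
            dump_insert({'id': 1, 'name': 'example'}, f)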

    Read the article

  • Paid service that will bulk convert HTML and text to PDF?

    - by SCS
    We have a repository of documents in HTML and text that we're looking to get into PDF format, but my boss (another developer) feels that writing this portion of the application would detract from the main development we're doing at the moment. Is anyone aware of any services that can do bulk conversions of these documents to PDF? A service that also provides thumbnails for the documents would be ideal.
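
    If a suitable service proves hard to find, one low-effort alternative (a suggestion, not something from the question) is to script the open-source wkhtmltopdf tool over the repository. A minimal sketch, assuming wkhtmltopdf is installed and using hypothetical input/output directory names:

        import subprocess
        from pathlib import Path

        src = Path('docs')    # hypothetical input directory of HTML files
        dst = Path('pdfs')    # hypothetical output directory
        dst.mkdir(exist_ok=True)

        for doc in src.glob('*.html'):
            out = dst / (doc.stem + '.pdf')
            # wkhtmltopdf renders an HTML file straight to a PDF file.
            subprocess.run(['wkhtmltopdf', str(doc), str(out)], check=True)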

    Read the article

  • Error 4861 when I BULK INSERT the contents of a CSV file into a table through a stored procedure

    - by Chandru
    Hi, could you please help me? I have an application in which I BULK INSERT the contents of a CSV file into a table through a stored procedure; the stored procedure uses BULK INSERT (SQL Server 2005). This works fine on a standalone system. However, when I use the same in a multi-tier (web server, application server and DB server) architecture, it throws error 4861. The files are stored on the web server. The error message, translated back into English, reads roughly:

        "Error - 2147217900: 4861: Cannot bulk load because the file \\Servername\c$\Folder1\Folder2\Folder3\file.csv could not be opened. Operating system error code is 5 (…)."

    Could you please help? Thanks and regards, Chandru
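
    One likely culprit, offered as a hedged note rather than a confirmed diagnosis: operating system error code 5 is Windows' "access denied", and BULK INSERT opens the file as the SQL Server service account on the DB server, not as the web application. A minimal sketch of the working pattern, using pyodbc for illustration with a hypothetical connection string and a hypothetical share the service account can read:

        import pyodbc

        # Hypothetical connection string; adjust driver, server and database.
        conn = pyodbc.connect('DRIVER={SQL Server};SERVER=dbserver;'
                              'DATABASE=mydb;Trusted_Connection=yes')
        cur = conn.cursor()
        # The UNC path is resolved on the DB server, so the SQL Server
        # *service account* needs read permission on this share.
        cur.execute(r"""
            BULK INSERT dbo.MyTable
            FROM '\\Servername\share\file.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
        """)
        conn.commit()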

    Read the article

  • AuthenticationFailedException in the middle of bulk mail sending code

    - by Ezhil
    We have a threaded program that sends bulk mail. Information such as the recipient address and the subject is fetched from a database, the mail is composed, and it is pushed to the SMTP server. One of our customers sent a bulk mail to 2,390 addresses. After 40 emails had been sent, the following exception suddenly occurred:

        EXCEPTION: javax.mail.AuthenticationFailedException
        STACKTRACE:
            javax.mail.Service.connect(Service.java:306)
            javax.mail.Service.connect(Service.java:156)
            javax.mail.Service.connect(Service.java:105)
            ...............
            java.lang.Thread.run(Thread.java:619)

    The remaining 2,350 emails failed. Why does this occur? Thanks for the suggestions and help. Ezhil
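
    Whatever the root cause (many SMTP servers throttle or drop authenticated sessions partway through a bulk run), a common mitigation is to send in small batches and reconnect, re-authenticating, between batches. A sketch of that pattern, shown in Python's smtplib rather than JavaMail purely for illustration, with hypothetical host and credentials:

        import smtplib
        import time
        from email.mime.text import MIMEText

        HOST, USER, PASSWORD = 'smtp.example.com', 'user', 'secret'  # hypothetical
        BATCH = 25  # arbitrary; reconnect after this many messages

        def send_all(messages):
            server = None
            for i, (to_addr, subject, body) in enumerate(messages):
                if server is None or i % BATCH == 0:
                    if server is not None:
                        server.quit()
                    server = smtplib.SMTP(HOST)
                    server.starttls()
                    server.login(USER, PASSWORD)  # re-authenticate each batch
                msg = MIMEText(body)
                msg['Subject'], msg['From'], msg['To'] = subject, USER, to_addr
                try:
                    server.sendmail(USER, [to_addr], msg.as_string())
                except smtplib.SMTPException:
                    time.sleep(5)   # back off; a real sender would also retry
                    server = None   # force a fresh connection next iteration
            if server is not None:
                server.quit()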

    Read the article

  • How to bulk mail-enable contacts from AD in Exchange 2007?

    - by George Hewitt
    Hello, We have several thousand 'contacts' set up in AD already for a faxing system. We're migrating to an online fax provider that uses e-mail rather than plain old telephone. So we've bulk-edited all the AD records so that the 'mail' attribute is populated with the right e-mail address in the right format. Now, how do we enable these contacts within Exchange 2007? I've looked through http://technet.microsoft.com/en-us/library/bb684891.aspx but that only seems to talk about manually editing the CSV output to specify the external addresses. AD already knows the external e-mail addresses - I just need the info in Exchange! Any thoughts?

    Read the article

  • Implementing emailing (bulk & event-based) features for my website

    - by Kabeer
    Hello. For my upcoming social networking website, I am looking for suggestions on the best way to implement emailing. Here are my requirements and constraints.

    Requirements:

        - Should be able to send emails based on events (new registrations, password changes, etc.), promotions (advertisements based on user consent), bulk mails (newsletters), reminders (profile updates), etc. I hope I got the point through.
        - Should be able to process faults (incorrect email address, mailbox full, etc.)
        - User-initiated invites (inviting friends to connect)

    Constraints:

        - As of now I am looking at GoDaddy for hosting. Subsequently I shall move, maybe to Amazon's cloud. GoDaddy seems to be excruciatingly conservative (not always a bad thing) when it comes to the ability to send email.
        - My tests on GoDaddy so far have been discouraging. There is a limit to the number of emails I can send, and sometimes if an email carries special characters it throws strange exceptions, such as claiming there was a virus-infected attachment (even though I hadn't attached a thing). The replies from GoDaddy support have been equally funny.

    My intent is not to portray GoDaddy as wrong, but I am looking for a workaround that frees me from these constraints: a mechanism or service that is either free or very cost-effective. I wonder how other sites address this. Mine is a .NET / Windows-based application.

    Read the article

  • Cannot bulk load. The file "c:\data.txt" does not exist.

    - by Daniel Brink
    Hi, I'm having a problem reading data from a text file into MS SQL. I created a text file in my C:\ called data.txt, but for some reason SQL Server cannot find the file. I get the error "Cannot bulk load. The file "c:\data.txt" does not exist." Any ideas? The data file (yes, I know the data looks crappy, but in the real world that's how it comes from clients):

        01-04 10.338,18 0,00 597.877,06- 5 0,7500 62,278-
        06-04 91.773,00 9.949,83 679.700,23- 1 0,7500 14,160-
        07-04 60.648,40 149.239,36 591.109,27- 1 0,7500 12,314-
        08-04 220.173,70 213.804,37 597.478,60- 1 0,7500 12,447-
        09-04 986.071,39 0,00 1.583.549,99- 3 0,7500 98,971-
        12-04 836.049,00 1.325.234,79 1.094.364,20- 1 0,7500 22,799-
        13-04 38.000,00 503.010,49 629.353,71- 1 0,7500 13,111-
        14-04 286.400,00 840.126,50 75.627,21- 1 0,7500 1,575-

    The SQL:

        CREATE TABLE #temp
        (
            vchCol1 VARCHAR(50),
            vchCol2 VARCHAR(50),
            vchCol3 VARCHAR(50),
            vchCol4 VARCHAR(50),
            vchCol5 VARCHAR(50),
            vchCol6 VARCHAR(50),
            vchCol7 VARCHAR(50)
        )

        BULK INSERT #temp
        FROM 'c:\data.txt'
        WITH
        (
            FIELDTERMINATOR = ' ',
            ROWTERMINATOR = '\n'
        )

        select * from #temp
        drop table #temp

    Read the article

  • Is it possible to bulk load an NDB child Entity in GAE?

    - by hmacread
    At some point in the future I may need to bulk load migration data (i.e. from a CSV). Has anyone had exceptions raised doing the following? Also, is there any change in behaviour if the ndb.put_multi() function is used?

        from google.appengine.ext import ndb

        class Y(ndb.Model):
            pass

        class X(ndb.Model):
            id = ndb.StringProperty()
            name = ndb.StringProperty()

        def read_csv_row(line):
            """Returns an (id, name) tuple."""

        for line in open('data.csv'):  # hypothetical CSV path
            id, name = read_csv_row(line)
            if not id:
                break
            x = X(parent=ndb.Key('Y', 'static_id'))
            x.id, x.name = id, name
            x.put()
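
    A minimal sketch of the put_multi() variant the question asks about, reusing the model and CSV helper above: entities are buffered and flushed in groups, so the datastore gets one RPC per batch instead of one per entity. The batch size is an arbitrary assumption.

        BATCH_SIZE = 500  # arbitrary; tune for entity size and RPC limits

        batch = []
        for line in open('data.csv'):  # hypothetical CSV path, as above
            id, name = read_csv_row(line)
            if not id:
                break
            x = X(parent=ndb.Key('Y', 'static_id'))
            x.id, x.name = id, name
            batch.append(x)
            if len(batch) >= BATCH_SIZE:
                ndb.put_multi(batch)   # one RPC for the whole batch
                batch = []
        if batch:
            ndb.put_multi(batch)       # flush the remainder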

    Read the article

  • Laravel Passing variable not working

    - by Friend
    Hello people, here is the code I have used in my controller:

        public function bulk()
        {
            return View::make('bulk')->with('message', 'hii there');
        }

    My routes file contains:

        Route::get('bulk', array('uses' => 'HomeController@bulk'))->before('auth');

    In my view I am testing it with:

        @if(Session::has('message'))
            Present
        @else
            not Present
        @endif

    The page renders the view with the message 'not Present'. Why is that? I even tried

        return Redirect::to('bulk')->with('message', 'hii there');

    but I get an error message on the console:

        mypro/public/bulk net::ERR_TOO_MANY_REDIRECTS

    What could be the problem? Is there any issue with the name? I tried this method earlier and it worked fine for me... :( I am using the Blade template engine.

    Read the article

  • How can I bulk rename files in a RAR or ZIP archive on the mac?

    - by Chris R
    I have a set of archive files -- both zip and rar formats -- inside of which I need to rename some files. Specifically, I want to do something like this:

        for each archive file in a directory
            for each file in the archive
                if the file name matches the regular expression /(.* - [0-9]{2})([0-9]{2} - .*)/
                    rename the file as \1-\2

    The trick isn't so much the generation of the new name; I can do that with bash or sed or anything else. It's the set of commands to manipulate the files inside the archives using rar/unrar or zip/unzip. (If it makes a difference, I'm re-formatting some CBR/CBZ files to get the double-page spreads to come up in the right order in SimpleComic -- it interprets page 0203 as page 203, which makes the story a bit hard to follow.)
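
    Neither zip nor rar offers a true in-place rename, so one workable approach for the zip/CBZ half -- a sketch under the assumption that rewriting each archive is acceptable; RAR would need the external rar command, since Python's standard library cannot write RAR files -- is to copy every member into a new archive under its corrected name:

        import re
        import zipfile
        from pathlib import Path

        PATTERN = re.compile(r'(.* - [0-9]{2})([0-9]{2} - .*)')

        for src in Path('.').glob('*.cbz'):  # CBZ files are ordinary zips
            tmp = src.with_name(src.name + '.tmp')
            with zipfile.ZipFile(src) as zin, \
                 zipfile.ZipFile(tmp, 'w', zipfile.ZIP_DEFLATED) as zout:
                for info in zin.infolist():
                    new_name = PATTERN.sub(r'\1-\2', info.filename)
                    # Writing under a new arcname is the "rename": same bytes,
                    # different member name in the rewritten archive.
                    zout.writestr(new_name, zin.read(info))
            tmp.replace(src)  # swap the rewritten archive into place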

    Read the article

  • If I make a mail server, can I send bulk email?

    - by Jake Smith
    I work for a small company and we have fallen into the fad of "email campaigns", a.k.a. junk mail. So far the company has gotten a subscriber list from our website and paid a good chunk of change for an emailer program. The problem is, our list has close to 4,000 people on it and growing, Gmail only allows 100 emails per account through SMTP, and I am on a tight budget so I can't hire anyone else. I was thinking of running a dedicated mail server off of the web server we have in the office. Is it possible to compose emails on your own server and then send them through your own SMTP? If it is, what software would I need, and is it free, or at least low-cost? We run a WAMP server; I set it up just for information, but I could switch it to LAMP or whatever if need be. Thank you for your time and your answers.

    Read the article

  • Bulk inserting and updating with Entity Framework (Probably a better alternative?)

    - by Dave
    I have a data set of devices, addresses, and companies that I need to import into our database, with the catch that the database may already include a specific device/address/company from the new data set. If that is the case, I need to update that entry with the new information, excluding addresses: for an address we check whether an exact copy already exists, and otherwise make a new entry. My issue is that it is very slow to grab a device/company in EF, update it if it exists, and otherwise insert it. To work around this I tried to load all the companies, devices, and addresses up front into respective hashmaps and check whether each identifier from the new data exists in the map, but this hasn't led to any performance increase. I've included my code below. Typically I would do a batch insert; I'm not sure what I would do for a batch update, though. Can someone advise a different route?

        var context = ObjectContextHelper.CurrentObjectContext;
        var oldDevices = context.Devices;
        var companies = context.Companies;
        var addresses = context.Addresses;

        Dictionary<string, Company> companyMap = new Dictionary<string, Company>(StringComparer.OrdinalIgnoreCase);
        Dictionary<string, Device> deviceMap = new Dictionary<string, Device>(StringComparer.OrdinalIgnoreCase);
        Dictionary<string, Address> addressMap = new Dictionary<string, Address>(StringComparer.OrdinalIgnoreCase);

        foreach (Company c in companies)
        {
            if (c.CompanyAccountID != null && !companyMap.ContainsKey(c.CompanyAccountID))
                companyMap.Add(c.CompanyAccountID, c);
        }
        foreach (Device d in oldDevices)
        {
            if (d.SerialNumber != null && !deviceMap.ContainsKey(d.SerialNumber))
                deviceMap.Add(d.SerialNumber, d);
        }
        foreach (Address a in addresses)
        {
            string identifier = GetAddressIdentifier(a);
            if (!addressMap.ContainsKey(identifier))
                addressMap.Add(identifier, a);
        }

        foreach (DeviceData.TabsDevice device in devices)
        {
            Company tempCompany;
            Address tempAddress;
            Device currentDevice;

            if (deviceMap.ContainsKey(device.SerialNumber)) // update a device
                deviceMap.TryGetValue(device.SerialNumber, out currentDevice);
            else // insert a new device
                currentDevice = new Device();

            currentDevice.SerialNumber = device.SerialNumber;
            currentDevice.SerialNumberTABS = device.SerialNumberTabs;
            currentDevice.Model = device.Model;

            if (device.CustomerAccountID != null && device.CustomerAccountID != "")
            {
                companyMap.TryGetValue(device.CustomerAccountID, out tempCompany);
                currentDevice.CustomerID = tempCompany.CompanyID;
                currentDevice.CustomerName = tempCompany.CompanyName;
            }
            if (companyMap.TryGetValue(device.ServicingDealerAccountID, out tempCompany))
                currentDevice.CompanyID = tempCompany.CompanyID;

            currentDevice.StatusID = 1;
            currentDevice.Retries = 0;
            currentDevice.ControllerFamilyID = 1;

            // set the Panel option to the default if it isn't set already
            if (currentDevice.EWBFrontPanelMsgOption == null)
                currentDevice.EWBFrontPanelMsgOption = context.EWBFrontPanelMsgOptions.Where(
                    i => i.OptionDescription.Contains("default")).Single();

            // link the device to the existing address as long as it is actually an address
            if (addressMap.TryGetValue(GetAddressIdentifier(device.address), out tempAddress))
            {
                if (GetAddressIdentifier(device.address) != "")
                    currentDevice.Address = tempAddress;
                else
                    currentDevice.Address = null;
            }
            else // insert a new Address and link the device to it (if not null)
            {
                if (GetAddressIdentifier(device.address) == "")
                    currentDevice.Address = null;
                else
                {
                    tempAddress = new Address();
                    tempAddress.Address1 = device.address.Address1;
                    tempAddress.Address2 = device.address.Address2;
                    tempAddress.Address3 = device.address.Address3;
                    tempAddress.Address4 = device.address.Address4;
                    tempAddress.City = device.address.City;
                    tempAddress.Country = device.address.Country;
                    tempAddress.PostalCode = device.address.PostalCode;
                    tempAddress.State = device.address.State;
                    addresses.AddObject(tempAddress);
                    addressMap.Add(GetAddressIdentifier(tempAddress), tempAddress);
                    currentDevice.Address = tempAddress;
                }
            }

            if (!deviceMap.ContainsKey(device.SerialNumber)) // if inserting, add to context
            {
                oldDevices.AddObject(currentDevice);
                deviceMap.Add(device.SerialNumber, currentDevice);
            }
        }
        context.SaveChanges();

    Read the article

  • Why is my django bulk database population so slow and frequently failing?

    - by bryn
    I decided I'd like to use django's model system rather than coding raw SQL to interface with my database, but I am having a problem that surely is avoidable. My models.py contains:

        class Student(models.Model):
            student_id = models.IntegerField(unique=True)
            form = models.CharField(max_length=10)
            preferred = models.CharField(max_length=70)
            surname = models.CharField(max_length=70)

    and I'm populating it by looping through a list as follows:

        from models import Student

        for id, frm, pref, sname in large_list_of_data:
            s = Student(student_id=id, form=frm, preferred=pref, surname=sname)
            s.save()

    I don't really want to save to the database on each iteration, but I don't know another way to get django to not forget about it (I'd rather add all the rows and then do a single commit). There are two problems with the code as it stands:

        1. It's slow -- about 20 students get updated each second.
        2. It doesn't even make it through large_list_of_data, instead throwing a DatabaseError saying "unable to open database file". (Possibly because I'm using sqlite3.)

    My question is: how can I stop these two things from happening? I'm guessing that the root of both problems is that I've got the s.save(), but I don't see a way of easily batching the students up and then saving them in one commit to the database.
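
    A minimal sketch of the batching the asker is reaching for, assuming Django 1.4 or later (bulk_create() may postdate the question): build the instances in memory, then hand the whole list to the ORM in one call.

        from models import Student

        students = [
            Student(student_id=id, form=frm, preferred=pref, surname=sname)
            for id, frm, pref, sname in large_list_of_data
        ]
        # One INSERT (or a few chunked ones) instead of one per student.
        # Note: bulk_create() bypasses save() and the model save signals.
        Student.objects.bulk_create(students)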

    Read the article

  • Questions About SqlBulkCopy

    - by chobo2
    Hi, I am wondering how I can do a mass insert and a bulk copy at the same time. I have 2 tables that should be affected by the bulk copy, as they depend on each other. I want it so that if a record fails while inserting into table 1, it gets rolled back and table 2 never gets updated; likewise, if table 1 inserts fine but an update to table 2 fails, table 1 gets rolled back. Can this be done with bulk copy?

    Read the article

  • SQLAuthority News – SQL Server Technical Article – The Data Loading Performance Guide

    - by pinaldave
    This white paper describes load strategies for achieving high-speed data modifications of a Microsoft SQL Server database. "Bulk Load Methods" and "Other Minimally Logged and Metadata Operations" provide an overview of two key and interrelated concepts for high-speed data loading: bulk loading and metadata operations. With this background in place, the white paper describes how these methods can be [...]

    Read the article
