Search Results

Search found 4099 results on 164 pages for 'bulk export'.

Page 21 of 164

  • BULK SMS, Long Codes (VMN/MSISDN), T-Mobile?

    - by John
    Does any US wireless carrier offer individuals or companies a direct connection to the SMSC? The number is 747-772-3101 (replace the 7's with 6's). This number is registered to T-Mobile, and T-Mobile verified that it belongs to a valid subscriber sending 160,000+ text messages monthly on nothing more than an unlimited text messaging plan on top of the cheapest voice plan. The company behind the number confirmed to me that they don't use GSM modems, as those are too slow. So I know it's possible, but who would I contact? Sales, and anyone else reachable through a 1-800 number, is ignorant of these services, and developer.t-mobile is worthless and doesn't reply to emails. Any info?

    Read the article

  • Visual Studio 2010 - Export (Project) Template menu option grayed out

    - by Jakobud
    In Visual Studio, I want to make a simple C++ project and export it as a template, so I can use the template to start new projects and save myself time. But the Export Template menu option is always grayed out; I've not once been able to click it. Does anyone know why? Does anyone know how to accomplish what I need (besides the obvious "make a copy of an existing project in Explorer")? It seems like project templates should be a no-brainer feature for VS. This seems to be the case for Visual Studio 2005 and 2010 (and probably 2008 as well, though I haven't checked).

    Read the article

  • Export SharePoint 2007 Custom List as RSS File

    - by matt
    Here's our scenario: (1) we've created a SharePoint 2007 calendar on our intranet site; (2) we want to run a daily job to export a subset of the events to an RSS file; (3) another job will move the RSS file to our public web site. We have some funny restrictions where we can't simply publish the RSS feed to the public, so we have to go this export route. I'm not clear on how to accomplish step 2. Ideally, we wouldn't have to write a lot of custom code to accomplish this. Thanks.

    Read the article

  • Need help with Drupal bulk mail low open rate for legitimate mailing list

    - by Ron Williams
    I've moved from Constant Contact to Drupal Simplenews/Mime Mail/SMTP. Previously the open rate was around 50% with Constant Contact, but now it's 4-5% for the same list via the setup below. Mail is getting out from the server, but something is wrong somewhere. Here's the setup:
    - The e-mail list consists of approximately 80,000 addresses, queued at 10,000 e-mails per cron run (cron runs hourly).
    - The server is a dual Core2Quad machine with 2GB of RAM.
    - When mail is being sent, the mail queue will usually go up to ~1000 at the beginning of the hour before reducing to ~250 by the time the next cron run occurs.
    - The newsletter is themed to display a custom newsletter style on send.
    - The newsletter is received by some, but appears to be bounced by many (based on the low open rate).
    - I've added SPF, DomainKeys, and a PTR record to the DNS.
    - The server hostname (listed in the PTR) is different from the hosted domain.
    - Very low spam score via SpamAssassin.
    - IP and domain are not blacklisted.
    - Mail goes out via the SMTP module on delivery.
    Any ideas?

    Read the article

  • Incorrect emacs indentation in a C++ class with DLL export specification

    - by Michael Daum
    I often write classes with a DLL export/import specification, but this seems to confuse Emacs' syntax parser. I end up with something like: class myDllSpec Foo { public: Foo( void ); }; Notice that the "public:" access specifier is indented incorrectly, as is everything that follows it. When I ask Emacs to describe the syntax at the beginning of the line containing public, I get: ((label 352)) If I remove the myDllSpec, the indentation is correct, and Emacs tells me that the syntax there is: ((inclass 352) (access-label 352)), which seems correct and reasonable. So I conclude that the syntax parser is not able to handle the DLL export spec, and that this is what's causing my indentation trouble. Unfortunately, I don't know how to teach the parser about my labels. This seems to be pretty common practice, so I'm hoping there's a way around it.

    Read the article

  • Bulk update + SQL + self join

    - by Nev_Rahd
    Hello all. I would like to update a column in a table with reference to other column(s) in the same table. For example, as in the figure below, I would like to update the effective end date with the minimum date that is greater than the effective start date of the current record. How can this be achieved in T-SQL? Can this be done with a single UPDATE statement? Thanks.
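
    One possible approach, sketched with hypothetical table and column names (the table from the figure is not reproduced here), is a correlated UPDATE that looks up the next start date for each row:

        -- Hypothetical table: dbo.Rates(RateID, EffectiveStartDate, EffectiveEndDate)
        -- Set each row's EffectiveEndDate to the smallest EffectiveStartDate that is
        -- later than the row's own EffectiveStartDate (i.e. the next period's start).
        UPDATE r
        SET r.EffectiveEndDate = nxt.NextStart
        FROM dbo.Rates AS r
        CROSS APPLY (
            SELECT MIN(r2.EffectiveStartDate) AS NextStart
            FROM dbo.Rates AS r2
            WHERE r2.EffectiveStartDate > r.EffectiveStartDate
        ) AS nxt
        WHERE nxt.NextStart IS NOT NULL;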

    Read the article

  • Export Multiple Sheets to Excel Through Browser

    - by ProfK
    I need to export multiple data tables to Excel on the client's machine, each to its own sheet. If it were just one sheet, I'd use the Excel/CSV content type, but I've heard something about an XML format that can represent an entire workbook. I don't want to go down the Packaging and .xlsx route, so I need standard .xls. Our bug tracker, Gemini, used to have an export function that produced an XML file that Excel automatically opened as a multi-sheet workbook, but I can't find it. Is there still such a mechanism, and where can I find that schema?

    Read the article

  • Export to Excel taking a long time from ASP pages?

    - by ricky
    I am using the following code to export to Excel from an .ASP page:

        GMID = Request.QueryString("GMID")
        Response.Buffer = False
        Response.ContentType = "application/vnd.ms-excel"
        DIR_YR = Request.QueryString("DIR_YR")
        CD = Request.QueryString("CD")
        YEAR = Request.QueryString("IND")

    The problem I am facing is that when there are around 2,000 records or more, the export to Excel prompts with the Open option; when I click that option, only "Download in progress..." is shown, but no Excel window actually opens. How can I fix this? For 700-800 rows it works fine. I am not looking to rewrite the whole thing, because the problem only occurs for one sales rep who has more than 2,000 rows; I am looking for a one- or two-line change.

    Read the article

  • Enable export to XML via HTTP on a large number of models with child relations

    - by Vasil
    I have a large number of models (120+) and I would like to let users of my application export all of the data from them in XML format. I looked at django-piston, but I would like to do this with minimal code. Basically, I'd like to have something like this: GET /export/applabel/ModelName/ would stream all instances of ModelName in applabel, together with its tree of related objects. I'd like to do this without writing code for each model. What would be the best way to do this?

    Read the article

  • Bulk Compare, Report, Update

    - by Tim Donaldon
    I need to import either a CSV or Excel file into a database. The column headers will match, but I want to compare the file against the database using an ItemID field, list the rows to be affected and the differences, and then allow an update to all the rows with a matching ID.
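
    A sketch of the compare-and-update step in T-SQL, assuming the file has first been bulk-loaded into a staging table (all table and column names here are made up for illustration):

        -- Assumed tables: dbo.Items (target) and dbo.ItemsStaging (the imported file),
        -- both keyed on ItemID, with matching columns Price and Description.

        -- 1. Report the rows that would be affected and how they differ.
        SELECT s.ItemID,
               t.Price       AS OldPrice,       s.Price       AS NewPrice,
               t.Description AS OldDescription, s.Description AS NewDescription
        FROM dbo.ItemsStaging AS s
        JOIN dbo.Items        AS t ON t.ItemID = s.ItemID
        WHERE t.Price <> s.Price OR t.Description <> s.Description;

        -- 2. Apply the update to every row with a matching ItemID.
        UPDATE t
        SET t.Price       = s.Price,
            t.Description = s.Description
        FROM dbo.Items        AS t
        JOIN dbo.ItemsStaging AS s ON s.ItemID = t.ItemID;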

    Read the article

  • Export Flash as Transparent MOV

    - by Chris Nicol
    Is it possible to export a Flash movie with a transparent background as a .MOV? I don't mean for embedding in a website; I mean the actual .MOV (or .avi). What I'm trying to accomplish: I have a Flash animation that I want to embed in a WPF application. I don't want to use a browser control within the WPF application because of all the issues that surround it (it has to be the topmost control, etc.). So my solution was to export said animation as a movie and play it in the MediaElement control. The only problem is that I need the background to be transparent, and I can't find a way to do this. Any suggestions or alternative solutions would be most welcome.

    Read the article

  • Unable to export runnable jar, launch configuration grayed out

    - by user13107
    I am not able to figure out how to export a runnable JAR in Eclipse. I have a Java project (project A, written by someone else); when it is imported in Eclipse, I can click Build Project and it will create a projectName.jar file under the bin/ directory. That JAR file contains binary *.class files. The JAR is added as an external library to another Java project (project B) which I want to debug, but because all the class files are binary, I'm not able to do line-by-line debugging. I tried exporting a Runnable JAR in Eclipse, but for that I have to select a launch configuration, and there is no main class in project A (I recursively grepped for main and didn't find any). What can I do to export a JAR of project A that also contains the source code (to be used for line-by-line debugging)?

    Read the article

  • DataSet XML export is empty

    - by Shaine
    I've got an in-memory DataSet with a couple of tables that is populated in code. Data-bound grids on the GUI show the table contents without a problem. Then I try to export the DataSet to XML: ds.WriteXml(fdSave.FileName, XmlWriteMode.WriteSchema); and get empty XML (a couple of lines with the DataSet name, but without any tables). If I export a table directly, I get all the data, but the DataSet name is obviously wrong: ds.Fields.WriteXml(fdSave.FileName, XmlWriteMode.WriteSchema); What am I missing? Is there any reasonable way to write the whole DataSet to a file?

    Read the article

  • How to do bulk update of views?

    - by Shaul
    My database has about 30 views, most of which have a reference to another database on this server (call it DB1). Now, without going into the reasons why, I need to update all those views to DB2, also on the local server. I would hate to have to do this manually on each view. Is there some SQL query I can run that will replace all occurrences of the string 'DB1' with 'DB2' in all my views?
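
    A sketch of one way to do this in T-SQL, assuming SQL Server 2005 or later: generate ALTER VIEW statements by rewriting each view's definition. Review the output before running it, since REPLACE is collation-dependent and will also match the string 'DB1' inside literals or comments.

        -- Generate an ALTER VIEW script for every view that references DB1.
        SELECT REPLACE(
                   REPLACE(m.definition, 'DB1.', 'DB2.'),
                   'CREATE VIEW', 'ALTER VIEW') AS AlterScript
        FROM sys.sql_modules AS m
        JOIN sys.views       AS v ON v.object_id = m.object_id
        WHERE m.definition LIKE '%DB1.%';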

    Read the article

  • sql server bulk copy out/postgres copy from infile

    - by Chris Curvey
    I'm starting a conversion of a system from MS SQL Server to Postgres. I have the table structures converted, and I use "bcp" to get the data out of SQL Server. When I try to load the file into Postgres with COPY, I get:

        ERROR:  invalid byte sequence for encoding "UTF8": 0x80
        HINT:  This error can also happen if the byte sequence does not match the encoding expected by the server, which is controlled by "client_encoding".
        CONTEXT:  COPY cm_outgoing, line 200: "200 c:\temp\200.xml 2009-10-10 01:50:44.000 1900-01-01 00:00:00.000"

    I've already used "sed" to get rid of the NUL (0x00) entries in the file, and I can't find any instances of 0x80 in the file that I'm trying to import. Any thoughts? Is there an easier way?

    Read the article

  • Emacs bulk indent for Python

    - by Vernon
    When working with Python in Emacs, if I want to wrap a block of code in a try/except, I often find myself indenting the whole block line by line. In Emacs, how do you indent the whole block at once? I am not an experienced Emacs user; I just find it is the best tool for working over SSH. I am using Emacs on the command line (Ubuntu), not as a GUI, if that makes any difference.

    Read the article

  • Bulk update & occasional insert (coredata) - Too slow

    - by Andrew
    Update: currently looking into NSSet's minusSet. Link: http://stackoverflow.com/questions/1475636/comparing-two-arrays

    Hi guys, I could benefit from your wisdom here. I'm using Core Data in my app; on first launch I download a data file and insert over 500 objects (each with 60 attributes) - fast, no problem. On each subsequent launch I download an updated version of the file, from which I need to update all existing objects' attributes (except maybe 5 attributes) and create new ones for items which have been added to the downloaded file. So, on first launch I get 500 objects; say a week later my file now contains 507 items. I create two arrays, one for existing and one for downloaded:

        NSArray *peopleArrayDownloaded = [CoreDataHelper getObjectsFromContext:@"person" :@"person_id" :YES :managedObjectContextPeopleTemp];
        NSArray *peopleArrayExisting = [CoreDataHelper getObjectsFromContext:@"person" :@"person_id" :YES :managedObjectContextPeople];

    If the count of each array is equal then I just do this:

        NSUInteger index = 0;
        if ([peopleArrayExisting count] == [peopleArrayDownloaded count]) {
            NSLog(@"Number of people downloaded is same as the number of people existing");
            for (person *existingPerson in peopleArrayExisting) {
                person *tempPerson = [peopleArrayDownloaded objectAtIndex:index];
                // NSLog(@"Updating id: %@ with id: %@", existingPerson.person_id, tempPerson.person_id);
                // I have 60 attributes to update on each object; is there a quicker way other than overwriting the existing values?
                index++;
            }
        } else {
            NSLog(@"Number of people downloaded is different to number of players existing");

    So now comes the slow part. I end up using this (which is too slow):

        NSLog(@"Need people added to the league");
        for (person *tempPerson in peopleArrayDownloaded) {
            NSPredicate *predicate = [NSPredicate predicateWithFormat:@"person_id = %@", tempPerson.person_id];
            // NSLog(@"Searching for existing person, person_id: %@", existingPerson.person_id);
            NSArray *filteredArray = [peopleArrayExisting filteredArrayUsingPredicate:predicate];
            if ([filteredArray count] == 0) {
                NSLog(@"Couldn't find an existing person in the downloaded file. Adding..");
                person *newPerson = [NSEntityDescription insertNewObjectForEntityForName:@"person" inManagedObjectContext:managedObjectContextPeople];

    Is there a way to generate a new array of index items referring to the additional items in my downloaded file? Incidentally, on my table views I'm using NSFetchedResultsController, so updating attributes will call [cell setNeedsDisplay] about 60 times per cell - not a good thing, and it can crash the app. Thanks for reading :)

    Read the article

  • Do partitions allow multiple bulk loads?

    - by ck
    I have a database that contains data for many "clients". Currently, we insert tens of thousands of rows into multiple tables every so often using .NET's SqlBulkCopy, which causes the entire tables to be locked and inaccessible for the duration of the transaction. As most of our business processes rely upon accessing data for only one client at a time, we would like to be able to load data for one client while updating data for another client. To make things more fun, all PKs, FKs and clustered indexes are on GUID columns (I am looking at changing this). I'm looking at adding a ClientID column to all tables and then partitioning on it. Would this give me the functionality I require?
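
    A minimal sketch of what that partitioning could look like in T-SQL (hypothetical object names; assumes an int ClientID and SQL Server 2005 or later):

        -- Partition function/scheme that maps ranges of ClientID values to partitions.
        CREATE PARTITION FUNCTION pfClient (int)
            AS RANGE LEFT FOR VALUES (100, 200, 300);   -- illustrative boundary ClientIDs

        CREATE PARTITION SCHEME psClient
            AS PARTITION pfClient ALL TO ([PRIMARY]);

        -- A table partitioned on ClientID; the partitioning column must be part of the clustered key.
        CREATE TABLE dbo.Orders (
            ClientID  int              NOT NULL,
            OrderID   uniqueidentifier NOT NULL,
            OrderDate datetime         NOT NULL,
            CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (ClientID, OrderID)
        ) ON psClient (ClientID);

    Note that, by default, lock escalation still targets the whole table; SQL Server 2008 added ALTER TABLE ... SET (LOCK_ESCALATION = AUTO) to allow escalation to the partition level instead.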

    Read the article

  • bulk insert from Java into Oracle

    - by Will Glass
    I need to insert many small rows (5 fields each) rapidly into Oracle. With MySQL, I break the inserts into groups of 100 and use one INSERT statement for each group of 100 rows. But with Oracle, user feedback is that mass inserts (anywhere from 1,000 to 30,000 rows) are too slow. Is there a similar trick I can use to speed up the programmatic inserts from Java into Oracle?

    Read the article

  • After Effects Question

    - by Josh
    A question here; I'm not sure if it's the correct Stack Exchange site, sorry if it isn't. I have an After Effects project for school, and I've created a movie using JPEG sequences (10 @ ~100-200 MB each). I have the output setting on the composition set to 640x480. I resized each JPEG layer via the Fit to Comp tool, but when I export the movie as a QuickTime movie, it is 1.1 GB for ~35 seconds of movie at 30 fps. What am I doing so horribly wrong here?

    Read the article
