Search Results

Search found 4705 results on 189 pages for 'export to csv'.


  • How can I send email attachment without using an additional library in Perl?

    - by CheeseConQueso
    Hey, I was wondering if there is a way to attach files (specifically .csv files) to a mail message in Perl without using MIME::Lite or any other additional libraries. Right now I have a 'mailer' function that works fine, but I'm not sure how to adapt it to attach files. Here is what I have:

        open(MAIL, "|/usr/sbin/sendmail -t");
        print MAIL "To: cheese\@yahoo.com\n";
        print MAIL "From: queso\@what.com\n";
        print MAIL "Subject: Attached is $filename\n\n";
        print MAIL "$message";
        close(MAIL);

    I think this is specific to UNIX.
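
    A minimal sketch of one way to do it: build the MIME multipart message by hand and pipe it to sendmail. The only module used is MIME::Base64, which ships with core Perl, so nothing extra needs installing; the addresses and the $filename/$message variables are carried over from the question.

        use MIME::Base64 qw(encode_base64);   # core module, not a CPAN install

        my $boundary = "====attachment-boundary====";
        open(MAIL, "|/usr/sbin/sendmail -t") or die "cannot fork sendmail: $!";
        print MAIL "To: cheese\@yahoo.com\n";
        print MAIL "From: queso\@what.com\n";
        print MAIL "Subject: Attached is $filename\n";
        print MAIL "MIME-Version: 1.0\n";
        print MAIL "Content-Type: multipart/mixed; boundary=\"$boundary\"\n\n";

        # plain-text body part
        print MAIL "--$boundary\n";
        print MAIL "Content-Type: text/plain\n\n";
        print MAIL "$message\n";

        # CSV attachment, base64-encoded
        print MAIL "--$boundary\n";
        print MAIL "Content-Type: text/csv; name=\"$filename\"\n";
        print MAIL "Content-Disposition: attachment; filename=\"$filename\"\n";
        print MAIL "Content-Transfer-Encoding: base64\n\n";
        open(my $fh, '<', $filename) or die "cannot read $filename: $!";
        print MAIL encode_base64(do { local $/; <$fh> });
        close($fh);

        print MAIL "--$boundary--\n";
        close(MAIL);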

    Read the article

  • How do I force SSL for some URLs and force non-SSL for all others?

    - by brad
    I'd like to ensure that certain URLs on my site are always accessed via HTTPS while all other URLs are accessed via HTTP. I can get either case working in my .htaccess file, however if I enable both, then I get infinite redirects. My .htaccess file is:

        <IfModule mod_expires.c>
            # turn off the module for this directory
            ExpiresActive off
        </IfModule>

        Options +FollowSymLinks
        AddHandler application/x-httpd-php .csv
        RewriteEngine On
        RewriteRule ^/?registration(.*)$ /register$1 [R=301,L]

        # Force SSL for certain URL's
        RewriteCond %{HTTPS} off
        RewriteCond %{REQUEST_URI} (login|register|account)
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

        # Force non-SSL for certain URL's
        RewriteCond %{HTTPS} on
        RewriteCond %{REQUEST_URI} !(login|register|account)
        RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

        # Force files ending in X to use same protocol as initial request
        RewriteRule \.(gif|jpg|jpeg|jpe|png|ico|css|js)$ - [S=1]

        # Use index.php as the controller
        RewriteCond %{REQUEST_URI} !\.(exe|css|js|jpe?g|gif|png|pdf|doc|txt|rtf|xls|swf|htc|ico)$ [NC]
        RewriteCond %{REQUEST_URI} !^(/js.*)$
        RewriteRule ^(.*)$ index.php [NC,L]
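
    A sketch of one commonly suggested fix, assuming the loop comes from the redirect rules being re-evaluated against internally rewritten URIs (e.g. after the rewrite to index.php): test %{THE_REQUEST}, which always holds the original request line sent by the browser and is never changed by internal rewrites.

        # Force SSL for certain URLs (match the original browser request only)
        RewriteCond %{HTTPS} off
        RewriteCond %{THE_REQUEST} \s/[^?\s]*(login|register|account) [NC]
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

        # Force non-SSL for everything else
        RewriteCond %{HTTPS} on
        RewriteCond %{THE_REQUEST} !\s/[^?\s]*(login|register|account) [NC]
        RewriteRule ^(.*)$ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]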

    Read the article

  • magic records being deleted

    - by chris
    I have customised osCommerce to pull in a CSV file of products; anything that doesn't have an image, a proper description and a proper title gets removed. The import runs as a cron job, pulling information from a supplier. It hasn't run since yesterday, but a product has disappeared. Anyone who has used osCommerce will know that product information is stored over multiple tables, for example products, products_description and so on. What has me stumped is that the information is deleted from the products table but not from the products_description table. The product that is being deleted is a manually input one which carries a special tag/prefix on the model field of the products table, so it shouldn't be touched at all. I'm clueless as to what is going on. Are there MySQL integrity checks deleting records? Could there be another plugin working on osCommerce?

    Read the article

  • Excel data connection - remove header row?

    - by ekoner
    The Excel spreadsheet is connecting to SQL Server 2005 using the connection string below:

        Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=True;Initial Catalog=XXXXXX;Data Source=XXXXXX;Extended Properties="HDR=No";Use Procedure for Prepare=1;Auto Translate=True;Packet Size=4096;Workstation ID=XXXXXX;Use Encryption for Data=False;Tag with column collation when possible=False

    It then pulls data from a view into Excel. The business user wants this information without a header row, which will allow her to review it and then save it as a "headless" CSV in the SAGE file format. I attempted to alter the connection string by adding HDR=No, but that hasn't worked. Additionally, I can't delete the header row: deleting the content just replaces the column names with "Column 1" etc. Any ideas appreciated!

    Read the article

  • Grid View To Excel

    - by rahulchandran
    Hi, I am trying to convert the contents of a GridView to an Excel file, and I am doing it using this code:

        string attachment = "attachment; filename= " + FileName;
        Response.ClearContent();
        Response.AddHeader("content-disposition", attachment);
        Response.ContentType = "application/excel";
        StringWriter sw = new StringWriter();
        HtmlTextWriter htw = new HtmlTextWriter(sw);
        gv.RenderControl(htw);
        Response.Write(sw.ToString());
        Response.End();

    The problem is I am getting some sort of HTML in an Excel-style format; there's JavaScript in the page, links, etc. What I want is to turn the results of my query into a comma-separated file. Is that do-able for free, or do I have to run the query myself, get the data and write out a CSV stream? Thanks
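
    A sketch of the second option: iterate the grid's underlying data and stream it out as CSV instead of rendering the control's HTML. GetReportData() is a hypothetical helper standing in for whatever returns the DataTable the grid is bound to.

        // Write the query results as CSV rather than the GridView's rendered HTML.
        DataTable table = GetReportData();   // hypothetical: same data the grid is bound to

        Response.ClearContent();
        Response.AddHeader("content-disposition", "attachment; filename=" + FileName);
        Response.ContentType = "text/csv";

        StringBuilder sb = new StringBuilder();
        // header row
        for (int i = 0; i < table.Columns.Count; i++)
        {
            if (i > 0) sb.Append(',');
            sb.Append(table.Columns[i].ColumnName);
        }
        sb.AppendLine();
        // data rows, quoting any field that contains a comma, quote or newline
        foreach (DataRow row in table.Rows)
        {
            for (int i = 0; i < table.Columns.Count; i++)
            {
                if (i > 0) sb.Append(',');
                string field = Convert.ToString(row[i]);
                if (field.IndexOfAny(new[] { ',', '"', '\n' }) >= 0)
                    field = "\"" + field.Replace("\"", "\"\"") + "\"";
                sb.Append(field);
            }
            sb.AppendLine();
        }

        Response.Write(sb.ToString());
        Response.End();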

    Read the article

  • Perl script to print out cars model and car color

    - by Gary Liggons
    I am trying to create a Perl script to print out car models and colors; the data is below. I want to know if there is any way to make the car model heading a field so that I can print it any time I want. The data below is a CSV file; the way I want the data to look on a report is shown below as well.

    This is how the data looks:

        Chevy
        blue,1978,Washington
        brown,1989,Dallas
        black,2001,Queens
        white,2003,Manhattan
        Toyota
        red,2003,Bronx
        green,2004,Queens
        brown,2002,Brooklyn
        black,1999,Harlem

    This is how I am trying to get the data to look in a report:

        Car Model: Toyota
        Color: Red
        Year: 2002
        City: Queens
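
    A rough sketch of one way to do it, assuming the file looks like the sample above (a make on a line of its own, followed by color,year,city rows): remember the current heading as you read, then print it with every record. The file name cars.csv is just an example.

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $model = '';                      # current "heading" (car make)
        open(my $fh, '<', 'cars.csv') or die "cannot open cars.csv: $!";
        while (my $line = <$fh>) {
            chomp $line;
            next if $line =~ /^\s*$/;
            if ($line !~ /,/) {              # a line with no commas is a model heading
                $model = $line;
                next;
            }
            my ($color, $year, $city) = split /,/, $line;
            print "Car Model: $model\n";
            print "Color: $color\n";
            print "Year: $year\n";
            print "City: $city\n\n";
        }
        close($fh);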

    Read the article

  • ruby on rails adding new route

    - by ohana
    I have an RoR application, Log, which is similar to the book store app. My logs_controller has all the default actions: index, show, update, create, delete. Now I need to add a new action, :toCSV. I defined it in logs_controller and added a new route in config/routes as:

        map.resources :logs, :collection => { :toCSV => :get }

    From irb, I checked the routes and can see the new route has already been added:

        rs = ActionController::Routing::Routes
        puts rs.routes
        GET /logs/toCSV(.:format)? {:controller=>"logs", :action=>"toCSV"}

    Then I ran the 'rake routes' command in a shell, and it returned:

        toCSV_logs GET /logs/toCSV(.:format) {:controller=>"logs", :action=>"toCSV"}

    Everything seems to be working. Finally, in my view code I added the following:

        link_to 'Export to CSV', toCSV_logs_path

    When I access it in the browser at 'http://localhost:3000/logs/toCSV', it complains: Couldn't find Log with ID=toCSV. I checked in script/server and saw this:

        ActiveRecord::RecordNotFound (Couldn't find Log with ID=toCSV):
        app/controllers/logs_controller.rb:290:in `show'

    It seems that when I click the link it is directed to the 'show' action instead of 'toCSV', and thus takes 'toCSV' as an id. Anyone know why this happens, and how to fix it? Thanks...
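
    One common cause, sketched below as an assumption rather than a diagnosis: routes are matched top to bottom, so if a plain map.resources :logs (without the :collection option) appears earlier in config/routes.rb, GET /logs/toCSV matches its show route (with :id => "toCSV") before the collection route is ever reached. Keeping a single declaration with the collection option on it avoids that.

        # config/routes.rb (Rails 2.x style, matching the question)
        ActionController::Routing::Routes.draw do |map|
          # exactly one declaration for :logs, carrying the collection route
          map.resources :logs, :collection => { :toCSV => :get }

          # ...other routes...
          map.connect ':controller/:action/:id'
          map.connect ':controller/:action/:id.:format'
        end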

    Read the article

  • Reading an Excel file in PHP

    - by Dinah
    I'm trying to read an Excel file (Office 2003). There is an Excel file that needs to be uploaded and its contents parsed. Via Google, I can only find answers to these related (and insufficient) topics: generating Excel files, reading Excel XML files, reading Excel CSV files, or incomplete abandoned projects. I own Office 2003, so if I need any files from there, they are available. It's installed on my box but isn't and can't be installed on my shared host. Edit: so far all answers point to PHP-ExcelReader and/or this additional article about how to use it.
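
    A sketch of reading an uploaded .xls with PHP-ExcelReader, assuming its classic reader.php interface (sheets[0]['cells'] as a 1-indexed [row][col] array); the include path and file name are placeholders.

        <?php
        // Assumes the classic PHP-ExcelReader (reader.php) class layout.
        require_once 'Excel/reader.php';

        $reader = new Spreadsheet_Excel_Reader();
        $reader->setOutputEncoding('UTF-8');
        $reader->read('upload.xls');

        $sheet = $reader->sheets[0];
        for ($row = 1; $row <= $sheet['numRows']; $row++) {
            for ($col = 1; $col <= $sheet['numCols']; $col++) {
                $value = isset($sheet['cells'][$row][$col]) ? $sheet['cells'][$row][$col] : '';
                echo $value, "\t";
            }
            echo "\n";
        }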

    Read the article

  • Write a file in UTF-8 using FileWriter (Java)?

    - by user1280970
    I have the following code; however, I want it to write the file as UTF-8 so it can handle foreign characters. Is there a way of doing this? Is there some parameter I need to pass? I would really appreciate your help with this. Thanks.

        try {
            BufferedReader reader = new BufferedReader(new FileReader("C:/Users/Jess/My Documents/actresses.list"));
            writer = new BufferedWriter(new FileWriter("C:/Users/Jess/My Documents/actressesFormatted.csv"));
            while ((line = reader.readLine()) != null) {
                // If the line starts with a tab then we just want to add a movie
                // using the current actor's name.
                if (line.length() == 0)
                    continue;
                else if (line.charAt(0) == '\t') {
                    readMovieLine2(0, line, surname.toString(), forename.toString());
                } else {
                    // Else we've reached a new actor
                    readActorName(line);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
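
    A minimal sketch of the usual fix: FileWriter and FileReader always use the platform default charset, so wrap an OutputStreamWriter/InputStreamReader around the file streams and name the encoding explicitly. The "UTF-8" on the reader is an assumption; use whatever encoding the source file is actually in.

        BufferedReader reader = new BufferedReader(new InputStreamReader(
                new FileInputStream("C:/Users/Jess/My Documents/actresses.list"), "UTF-8"));
        BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(
                new FileOutputStream("C:/Users/Jess/My Documents/actressesFormatted.csv"), "UTF-8"));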

    Read the article

  • Numpy modify array in place?

    - by User
    I have the following code, which is attempting to normalize the values of an m x n array (it will be used as input to a neural network, where m is the number of training examples and n is the number of features). However, when I inspect the array in the interpreter after the script runs, I see that the values are not normalized; that is, they still have the original values. I guess this is because the assignment to the array variable inside the function is only seen within the function. How can I do this normalization in place? Or do I have to return a new array from the normalize function?

        import numpy

        def normalize(array, imin = -1, imax = 1):
            """I = Imin + (Imax-Imin)*(D-Dmin)/(Dmax-Dmin)"""
            dmin = array.min()
            dmax = array.max()
            array = imin + (imax - imin)*(array - dmin)/(dmax - dmin)
            print array[0]

        def main():
            array = numpy.loadtxt('test.csv', delimiter=',', skiprows=1)
            for column in array.T:
                normalize(column)
            return array

        if __name__ == "__main__":
            a = main()
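
    A sketch of the in-place version: assigning through a slice (array[:] = ...) writes into the existing buffer that the caller's column view shares, whereas rebinding the local name array (as the original does) only affects the function's local variable.

        import numpy

        def normalize(array, imin=-1, imax=1):
            """I = Imin + (Imax-Imin)*(D-Dmin)/(Dmax-Dmin), done in place."""
            dmin = array.min()
            dmax = array.max()
            # Slice assignment mutates the array (and the view passed in)
            # instead of rebinding the local name to a brand-new array.
            array[:] = imin + (imax - imin) * (array - dmin) / (dmax - dmin)

        if __name__ == "__main__":
            data = numpy.loadtxt('test.csv', delimiter=',', skiprows=1)
            for column in data.T:      # each column is a view into data
                normalize(column)
            print(data[0])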

    Read the article

  • How can I implement incremental (find-as-you-type) search on command line?

    - by florianbw
    I'd like to write small scripts which feature incremental search (find-as-you-type) on the command line. Use case: I have my mobile phone connected via USB; using gammu --sendsms TEXT I can write text messages. I have the phonebook as CSV and want to search-as-I-type on that. What's the easiest/best way to do it? It might be in bash/zsh/Perl/Python or any other scripting language. Edit: Solution: modifying Term::Complete (http://search.cpan.org/~jesse/perl-5.12.0/lib/Term/Complete.pm) did what I want. See below for the answer.

    Read the article

  • Python MySQLdb LOAD LOCAL INFILE problems

    - by belvoir
    The problem is a simple one. When I execute the following, I get different results depending on whether I run it from the MySQL console or from inside a Python script using MySQLdb:

        LOAD DATA LOCAL INFILE '/tmp/source.csv'
        INTO TABLE test
        FIELDS TERMINATED BY '|'
        IGNORE 1 LINES;

    The console gives the following results:

        Records: 35002  Deleted: 0  Skipped: 0  Warnings: 0

    Python (via .info()) returns the following:

        Records: 34977  Deleted: 0  Skipped: 0  Warnings: 8

    So in summary: same source file, same SQL request, different results. From the console I can SHOW WARNINGS and get a better handle on which records are causing the problems and why, but from Python I can't identify how to do this, or more importantly what the cause of the problem could be. Any suggestions? MySQL Server '5.1.41-3ubuntu12.1', Python '2.6.5', tables are MyISAM.
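
    A sketch of pulling the warnings back from Python: issue SHOW WARNINGS on the same cursor/connection that ran the load, before doing anything else on it. Host, credentials and database name below are placeholders.

        import MySQLdb

        conn = MySQLdb.connect(host='localhost', user='user', passwd='secret',
                               db='test', local_infile=1)   # placeholder credentials
        cur = conn.cursor()
        cur.execute("""LOAD DATA LOCAL INFILE '/tmp/source.csv'
                       INTO TABLE test
                       FIELDS TERMINATED BY '|'
                       IGNORE 1 LINES""")
        print conn.info()                    # e.g. "Records: 34977 ... Warnings: 8"

        cur.execute("SHOW WARNINGS")         # one row per warning: level, code, message
        for level, code, message in cur.fetchall():
            print level, code, message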

    Read the article

  • Condensing multiple else if statements, referencing them from a table?

    - by Haskella
    Hi, I'm about to type some 500 else-if statements into my code (PHP), each with the exact same structure:

        if (color=White); rgb = 255-255-255; print rgb;
        else if (color=Black); rgb = 0-0-0; print rgb;
        else if (color=Red); rgb = 255-0-0; print rgb;
        else if (color=Blue); rgb = 0-0-255; print rgb;
        [the list keeps going]

    I also (luckily) have a table that displays the color in the first column and the RGB value in the next column for all 500 of them... how do I use this to save time typing all those else-if statements? Somehow I have to reference the table file (made in Excel; I'm guessing I'll have to save it as .csv?)
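
    A sketch of the lookup-table approach, assuming the Excel sheet is saved as colors.csv with the color name in the first column and the RGB value in the second: load it once into an associative array, and every else-if branch collapses into a single array lookup.

        <?php
        // Build a name => rgb map from the CSV exported from Excel.
        $rgbByColor = array();
        $handle = fopen('colors.csv', 'r');
        while (($row = fgetcsv($handle)) !== false) {
            if (count($row) < 2) continue;
            list($name, $rgb) = $row;                    // e.g. "White", "255-255-255"
            $rgbByColor[strtolower(trim($name))] = trim($rgb);
        }
        fclose($handle);

        // One lookup replaces the 500 else-if branches.
        $color = 'Red';
        if (isset($rgbByColor[strtolower($color)])) {
            echo $rgbByColor[strtolower($color)];        // prints 255-0-0
        } else {
            echo "unknown color: $color";
        }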

    Read the article

  • formatting the output of a sql query.

    - by randeepsp
    Hi! I am using Solaris; from Solaris I'm logging into SQL*Plus, and my database is Oracle 9i. I am spooling the output of my query into a file, and I want it in .csv format so that I can copy it into Excel. Can anyone help me out? My query is like:

        select name, id, location from employee;
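
    A sketch of one common way to get CSV out of SQL*Plus: switch off the page decoration and concatenate the columns with commas yourself before spooling. The spool file name is just an example.

        set heading off
        set feedback off
        set pagesize 0
        set linesize 1000
        set trimspool on

        spool /tmp/employee.csv
        select name || ',' || id || ',' || location
          from employee;
        spool off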

    Read the article

  • Http Geocoder (Google) Accuracy level

    - by sushruth
    I am geocoding a large amount of user-entered addresses and am interested in the accuracy levels returned. My GOAL is to get the BEST POSSIBLE ACCURACY score for a given address. I call the geocoder API the following way:

        http://maps.google.com/maps/geo?q={address}&output=csv&sensor=false&key=xx

    Now, the accuracy levels returned for the same address with/without the premise name:

        q = Key Arena, 305 Harrison Street, Seattle, WA 98109   (Accuracy is 5)
        q = 305 Harrison Street Seattle, WA 98109               (Accuracy is 8)
        q = Key Arena, Seattle, WA 98109                        (Accuracy is 9)

    It's obvious from the above that the Google servers do not return the best accuracy when the street name is appended with the premise/venue. So the question is :) is there a way to pass the complete address (with premise name, i.e. case 1) and get the maximum accuracy? (Or how can I tell the Google server that the address is passed with a premise/building name plus a street name?) If you are thinking "why not just use case 3", the answer is that these are user-entered addresses; they could enter "my mom's house" for the premise with an accurate street address, in which case I want the accuracy to be 8, not 5.

    Read the article

  • Getting "select permission denied" when using LINQ but my account is a sysadmin

    - by Wayne M
    I have a console app that's geared to be automatically run as a Scheduled Task. I use LINQ to SQL to pull some data out of the database, format it into a CSV and email it to a client. All of a sudden I am getting the error "SELECT permission denied for table", but the account I'm using to connect to the database (specified in my app.config file) has the "sysadmin" server role (bad programmer, I know; I'll get around to changing it to a better account later, but I want to make sure it works first). I can connect directly to the SQL database using that very same account and query the table in question without a problem; it only seems to happen when using the LINQ code. Any idea what would be causing this?

    Read the article

  • Sqlite3 Database versus populating Arrays

    - by Kenoy
    Hi, I am working on a program that requires me to input values for 12 objects, each with 4 arrays, each with 100 values (4,800 values). The 4 arrays represent possible outcomes based on 2 boolean values, i.e. YY, YN, NN, NY, and the 100 values in each array are what I want to extract based on another input variable. I previously had all possible outcomes in a CSV file and imported these into SQLite, where I can query for the value using SQL. However, it has been suggested to me that an SQLite database is not the way to go, and instead I should populate hard-coded arrays. Which would be better at run time and for memory management?

    Read the article

  • Powershell and some simple string manipulation

    - by Pat
    Need some help with building a PowerShell script to help with some basic string manipulation. I know just enough PowerShell to get in trouble, but can't figure out the syntax or coding to make this work. I have a text file that looks like this:

        Here is your list of servers:
        server1
        server2.domain.local
        server3
        Total number of servers: 3

    I need to take that text file and drop the first and last lines (always the first and last). Then I need to take the remaining lines and basically turn them into a CSV file. The final output should be a text file that looks like this:

        server1,server2.domain.local,server3

    Any suggestions on where to start? Thanks!
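
    A minimal sketch of one way to do it (file paths are just examples): read all lines, slice off the first and last by index, and join what's left with commas.

        # Drop the first and last lines, join the rest with commas.
        $lines   = Get-Content 'C:\temp\servers.txt'
        $servers = $lines[1..($lines.Count - 2)]        # everything except first and last
        ($servers -join ',') | Set-Content 'C:\temp\servers.csv'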

    Read the article

  • Sorting tab delimited text file based on multiple columns in natural way [duplicate]

    - by Vignesh
    This question already has an answer here: "Sorting a column of CSV file resulting in 1123 appearing before 232" (1 answer)

    I am trying to sort a file based on two columns. E.g.:

        chr19   1070019 1070020
        chr16   869712  869713
        chr1    1378131 1378132
        chr12   189386  189387
        chr4    254941  254942
        chr16   1476500 1476501
        chr2    1476810 1476811
        chr19   313283  313284
        chr17   595817  595818
        chr18   656897  656898
        chr19   1061829 1061830

    I tried sort -t $'\t' -k1,1 -k2,2 <filename> but it doesn't work. I want the output to be sorted by the first column, and by the second column within the first. I want to do a natural sort, not a lexical sort. E.g.:

        chr1    1378131 1378132
        chr2    1476810 1476811
        chr4    254941  254942
        chr12   189386  189387
        chr16   869712  869713
        chr16   1476500 1476501
        chr17   595817  595818
        chr18   656897  656898
        chr19   313283  313284
        chr19   1061829 1061830
        chr19   1070019 1070020

    Anyone any idea?
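
    A sketch using GNU sort (coreutils), assuming the fields really are tab-separated: appending V to the first key gives a version/natural comparison (so chr2 sorts before chr12), and n on the second key sorts it numerically.

        # Natural sort on column 1 (chr1 < chr2 < chr12 ...), numeric sort on column 2.
        # Requires GNU sort; -V is the version/natural comparison.
        sort -t $'\t' -k1,1V -k2,2n input.txt > sorted.txt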

    Read the article

  • Creating an Excel Template for different data size

    - by dassouki
    I created an Excel template for a file I've built for a routine work calculation. The file takes data from the data logger, does some analysis on it, and outputs one number regardless of the input size. The problem I'm having is that I have to modify the sheet to suit the number of rows, as every day the data logger outputs a different number of rows. There are about 15 sheets in the workbook, and it's annoying to have to change every one of them every day. What I'd like to do is input the data logger CSV and, boom, the result gets output. Is there a way, through VBA or otherwise, to achieve this?

    Read the article

  • How to debug/reformat C printf calls with lots of arguments in vim?

    - by Costi
    I have a function call in a program that I'm maintaining that has 28 arguments for a printf call; it's printing a lot of data to a CSV file. I have trouble following which argument goes where, and I have some mismatches in the parameter types. I enabled -Wall in gcc and I get warnings like:

        n.c:495: warning: int format, pointer arg (arg 15)
        n.c:495: warning: format argument is not a pointer (arg 16)
        n.c:495: warning: double format, pointer arg (arg 23)

    The function call is like this:

        fprintf (ConvFilePtr, "\"FORMAT3\"%s%04d%s%04d%s%s%s%d%s%c%s%d%c%s%s%s%s%s%s%s%11.lf%s%11.lf%s%11.lf%s%d\n", some_28_arguments_go_here);

    I would like to know if there is a vim plugin that highlights the printf format specifier when I move the cursor over a variable. Other solutions? How could I reformat the code to make it more readable?
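
    On the reformatting question, one sketch (the argument names below are made up for illustration): rely on C's adjacent string literal concatenation so the format string can be split into pieces, each commented with the argument it consumes, and list one argument per line.

        /* Adjacent string literals are concatenated by the compiler, so the
         * format can be broken up so each conversion sits next to a comment
         * naming the argument it consumes. Shortened to 5 arguments here. */
        fprintf(ConvFilePtr,
                "\"FORMAT3\""
                "%s"        /* arg 1: site name   */
                "%04d"      /* arg 2: site id     */
                "%s"        /* arg 3: separator   */
                "%11.lf"    /* arg 4: reading     */
                "%d"        /* arg 5: status code */
                "\n",
                site_name,      /* 1 */
                site_id,        /* 2 */
                separator,      /* 3 */
                reading,        /* 4 */
                status);        /* 5 */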

    Read the article

  • Excel Spreadsheet - Best way to perform an Oracle Query on a cell

    - by Jamie
    Hi there, I have an Excel spreadsheet. There is a cell containing a concatenated name and surname (don't ask why), for example:

        Cell A2: BLOGGSJOE

    On this cell, I would like to run the following SQL and output the result to cells A3, A4 and A5:

        SELECT i.id, i.forename, i.surname
        FROM individual i
        WHERE UPPER(REPLACE('" & A2 & "', ' ', '')) = UPPER(REPLACE(i.surname || i.forename, ' ', ''))
        AND NVL(i.ind_efface, 'N') = 'N'

    Any idea how I could perform an Oracle query on each cell and return the result? I have enabled an Oracle data source connection in Excel, just not sure what to do now. Is this a stupid approach, and can you recommend a better, more proficient way? Thanks muchly! I lack the necessary experience in this type of thing! :-) EDIT: I am aware that I could just write a simple ruby/php/python/whatever script to loop through the Excel spreadsheet (or CSV file) and then perform the query etc., but I thought there might be a quick way in Excel itself.

    Read the article

  • Trouble getting email attachment from Exchange

    - by JimR
    I am getting the error message "The remote server returned an error: (501) Not Implemented." when I try to use HttpWebRequest.GetResponse() with the GET method to get an email attachment from Exchange. I have tried changing the HttpVersion, and I don't think it is a permissions issue since I can search the inbox. I know my credentials are correct, as they are used to get the HREF with HttpWebRequest.Method = "SEARCH" on the inbox (https://mail.mailserver.com/exchange/testemailaccount/Inbox/).

        HREF = https://mail.mailserver.com/exchange/testemailaccount/Inbox/testemail.EML/attachment.csv

    Sample code:

        HttpWebRequest req = (System.Net.HttpWebRequest)HttpWebRequest.Create(HREF);
        req.Method = "GET";
        req.Credentials = this.mCredentialCache;
        string data = string.Empty;
        using (WebResponse resp = req.GetResponse())
        {
            Encoding enc = Encoding.Default;
            if (resp == null)
            {
                throw new Exception("Response contains no information.");
            }
            using (StreamReader sr = new StreamReader(resp.GetResponseStream(), Encoding.ASCII))
            {
                data = sr.ReadToEnd();
            }
        }

    Read the article

  • Preview result of update/insert query without comitting changes to database in MySQL?

    - by Camsoft
    I am writing a script to import CSV files into existing tables within my database. I decided to do the insert/update operations myself using PHP and INSERT/UPDATE statements, and not use MySQL's LOAD DATA INFILE command; I have good reasons for this. What I would like to do is emulate the insert/update operations and display the results to the user, then give them the option of confirming that this is OK before committing the changes to the database. I'm using the InnoDB storage engine with support for transactions. Not sure if this helps, but I was thinking along the lines of: insert/update, query the data, display it to the user, then either commit or roll back the transaction? Any advice would be appreciated.
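
    A sketch of that flow using PDO (the DSN, credentials, table and column names are placeholders, and $csvRows / $userConfirmed stand in for the parsed file and the user's answer): run the statements inside a transaction, read the affected rows back for the preview, and only commit after confirmation. Note the transaction must stay open on the same connection until the user answers, so in a stateless web flow it is often simpler to stage rows into a temporary table instead.

        <?php
        $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        $pdo->beginTransaction();
        try {
            $stmt = $pdo->prepare('INSERT INTO contacts (name, email) VALUES (?, ?)');
            foreach ($csvRows as $row) {
                $stmt->execute(array($row[0], $row[1]));
            }

            // Preview: the same connection sees its own uncommitted rows.
            $preview = $pdo->query('SELECT name, email FROM contacts ORDER BY id DESC LIMIT 50')
                           ->fetchAll(PDO::FETCH_ASSOC);
            // ...display $preview to the user...

            if ($userConfirmed) {
                $pdo->commit();      // make the changes permanent
            } else {
                $pdo->rollBack();    // discard everything since beginTransaction()
            }
        } catch (Exception $e) {
            $pdo->rollBack();
            throw $e;
        }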

    Read the article

  • Doing a ajax / json add to database, and have a "wait doing operation" icon

    - by Dejan.S
    Hi. There's a part of my page I want to improve. It's a file upload where users can add their contacts from files like Excel, CSV & Outlook. I read the contacts and place them in the database, and what I would like is a regular icon that spins while that operation is running. How could I do that? Ajax? I don't want a progress bar for the file upload, just an indicator for the operation of reading the file. EDIT: I want to know how to make this work with the add-to-database step using Ajax; should I use an UpdatePanel?
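
    If the import step runs as an async postback inside an UpdatePanel, one sketch (control IDs and the spinner image path are placeholders) is to pair it with an UpdateProgress control, whose template is shown automatically while the partial postback is in flight and hidden when it completes.

        <asp:UpdatePanel ID="upImport" runat="server">
            <ContentTemplate>
                <asp:Button ID="btnImport" runat="server" Text="Import contacts"
                            OnClick="btnImport_Click" />
            </ContentTemplate>
        </asp:UpdatePanel>

        <asp:UpdateProgress ID="upProgress" runat="server" AssociatedUpdatePanelID="upImport">
            <ProgressTemplate>
                <img src="images/spinner.gif" alt="Importing contacts..." />
            </ProgressTemplate>
        </asp:UpdateProgress>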

    Read the article
