Search Results

Search found 60932 results on 2438 pages for 'data operations'.

Page 249/2438 | < Previous Page | 245 246 247 248 249 250 251 252 253 254 255 256  | Next Page >

  • Read header data from files on remote server

    - by rejeep
    Hi! I'm working on a project where I need to read header data from files on remote servers. There are many large files, so I can't read the whole files, just the header data I need. The only solution I have so far is to mount the remote server with FUSE and then read the headers from the files as if they were on my local computer. I've tried it and it works, but it has some drawbacks, especially with FTP (curlftpfs) compared to SSH: it is really slow (from the same server, 90 files were read in 18 seconds over SSH versus 10 files in 39 seconds over FTP), and it is not dependable - sometimes the mount point will not unmount, and if the server is active and a passive mount is done, the mount point and its parent folder get locked for about 3 minutes. It also times out even while data transfer is going on (I guess this is the FTP protocol rather than curlftpfs). FUSE is a solution, but I don't like it very much because I don't feel that I can trust it. So my question is basically whether there are any other solutions to the problem. The language is preferably Ruby, but any other will work if Ruby does not support the solution. Thanks!
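
    One direction worth exploring (editor's sketch, not from the original post): both FTP and SFTP let you pull just the first part of a file without mounting anything, so the header can be fetched directly. Below is a minimal, untested Python illustration using the standard-library ftplib - it starts a RETR transfer, reads only the first kilobyte or so, and then aborts; the host, credentials, path and byte count are placeholders. Ruby offers the same primitives through Net::FTP and the net-sftp gem.

        import ftplib

        def read_header(host, user, password, path, nbytes=1024):
            """Fetch only the first nbytes of a remote file over FTP."""
            ftp = ftplib.FTP(host)
            ftp.login(user, password)
            conn = ftp.transfercmd('RETR ' + path)      # open the data connection
            chunks, received = [], 0
            while received < nbytes:
                data = conn.recv(min(4096, nbytes - received))
                if not data:                            # file shorter than nbytes
                    break
                chunks.append(data)
                received += len(data)
            conn.close()                                # drop the data connection early
            try:
                ftp.abort()                             # tell the server the rest is not wanted
            except ftplib.all_errors:
                pass
            ftp.close()
            return b''.join(chunks)

        # header = read_header('ftp.example.com', 'user', 'secret', 'videos/big_file.mkv')

    Exact abort/cleanup behaviour varies between FTP servers, so treat this as a starting point rather than a drop-in solution.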

    Read the article

  • TCP socket communication

    - by raven
    Hello, I am creating a chat application in Java. I have a method (onMouseRelease) inside an object that creates a TCP server and waits for a connection like this:

        ServerSocket server = new ServerSocket(port);
        Socket channel = server.accept();

    Now I want to make a thread that loops and reads data from the socket, so that once the user on the other side sends me a string, I can extract the data from the socket (or is it called a packet? Sorry, I am new to this) and update a text box with the additional string. I have no idea how to read (extract) the information from the socket and then append it to a JTextArea called userOutput, nor how to send a string to the other client so that it can also read the new data and update its JTextArea. From what I know, for two-sided TCP communication you need one computer to host a server and the other to connect as a client, and once the connection is established the client can also receive new information over the socket. Is that true? And please tell me how. Any help is appreciated! I know this is a bit long, but I have searched a lot and didn't understand it (I saw something like PrintWriter but failed to understand it).
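
    Editor's note: the pattern being asked about is a background reader loop that blocks on the socket and hands each received line to the UI. In Java the usual pieces are a BufferedReader wrapped around the socket's InputStream for reading, a PrintWriter (with autoflush) around the OutputStream for sending, and SwingUtilities.invokeLater for appending to the JTextArea from the Swing thread. The language-independent shape of that loop is sketched below in Python purely as an illustration - the line-per-message framing and all names are assumptions, not something from the original question.

        import socket
        import threading

        def reader_loop(sock, on_line):
            """Block on the socket and deliver each newline-terminated message."""
            stream = sock.makefile('r', encoding='utf-8')
            for line in stream:              # blocks until a full line arrives
                on_line(line.rstrip('\n'))   # hand the text to the UI layer

        def start_reader(sock, on_line):
            """Run the reader loop on a background thread so the UI stays responsive."""
            t = threading.Thread(target=reader_loop, args=(sock, on_line), daemon=True)
            t.start()
            return t

        # usage sketch: after server.accept() (or a client connect), start the reader
        # start_reader(connected_socket, lambda msg: print('received:', msg))
        # and send by writing a newline-terminated string the other side's loop will pick up:
        # connected_socket.sendall('hello\n'.encode('utf-8'))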

    Read the article

  • Rails modeling for a user

    - by Trevor Hartman
    When building a Rails app that allows a User to log in and create data, is it best to set up a belongs_to :user association on every single model? For example, let's say a user can create Favorites, Colors and Tags, and let's say Favorites has_many :tags and Colors also has_many :tags. Is it still important for Tags to belong_to :user, assuming the User is the only person who has authority to edit those tags? And a similar question along the same lines: when updating data in FavoritesController, I've come to the conclusion that you perform CRUD operations by always doing something like User.favorites.find(params[:id]).update_attributes(params[:favorite]) so that users can definitely only update models that belong to them. Right?

    Read the article

  • Serial Data Not Transmitted in C# Application

    - by Jim Fell
    Hello. I have a C# application in which serial (COM1) data appears to sometimes not get transmitted. Following is a simplified snippet of my code (calls to textBox writes have been removed):

        try
        {
            serialPort1.Write("D");
            serialPort1.Write(msg, 0, 512);
            serialPort1.Write("d");
            serialPort1.Write(pCsum, 0, 2);
        }
        catch (SystemException ex) { /* ... */ }

    What is odd is that this same code works just fine when the port is configured for 115.2 kbps. However, when running at 9600 bps, data that should be transmitted by this code sometimes does not get sent. I have verified this by monitoring the receive flag on the remote device. No exceptions are thrown from within the try statement. Is there something else (Flush, etc.) that I should be doing to make sure the data is transmitted? Any thoughts or suggestions you may have would be appreciated. I'm using Microsoft Visual C# 2008 Express Edition. Thanks.

    Read the article

  • Export to csv, string w/ comma in it, splits it up

    - by Brad
    This code exports data into a CSV file, which is opened in Excel. When a string has a comma within it, it throws off the order of the columns. I need help modifying the code below so that data containing a comma does not create a new column; I am assuming the fix is to wrap each string in double quotes so that any comma inside the quotes is treated as part of the value rather than as a separator. Any help is appreciated.

        $result = mysql_query("select lname, fname, email, dtelephone, etelephone, contactwhen, comments, thursday, friday, saturday, sunday, monday FROM volunteers_2010");
        $csv_output .= "Last Name,First Name,Email,Telephone (Day),Telephone (Evening),Contact When,Comments,Thursday,Friday,Saturday,Sunday,Monday,Comments\n";
        $i = 0;
        if (mysql_num_rows($result) > 0) {
            while ($row = mysql_fetch_assoc($result)) {
                $csv_output .= $row['Field'].", ";
                $i++;
            }
        }
        $csv_output .= "\n";

        $values = mysql_query("SELECT lname, fname, email, dtelephone, etelephone, contactwhen, comments, thursday, friday, saturday, sunday, monday FROM volunteers_2010 WHERE venue_id = $venue_id");
        while ($rowr = mysql_fetch_row($values)) {
            for ($j = 0; $j < $i; $j++) {
                $csv_output .= $rowr[$j].", ";
            }
            $csv_output .= "\n";
        }
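
    Editor's note: the rule being described is standard CSV quoting - wrap any field that contains a comma, a double quote or a newline in double quotes, and double any double quote inside the field (PHP's fputcsv() applies this rule automatically). The snippet below is only a quick Python illustration of what the output should look like; the sample values are made up.

        import csv
        import io
        import sys

        rows = [
            ["Smith", "Bob", "bob@example.com", "Likes dogs, cats, and birds"],
            ["Jones", "Ann", "ann@example.com", 'Said "maybe" about Saturday'],
        ]
        buf = io.StringIO()
        writer = csv.writer(buf)      # quotes fields containing commas, quotes or newlines
        writer.writerows(rows)
        sys.stdout.write(buf.getvalue())
        # Smith,Bob,bob@example.com,"Likes dogs, cats, and birds"
        # Jones,Ann,ann@example.com,"Said ""maybe"" about Saturday"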

    Read the article

  • Saving Data to Relational Database (Entity Framework)

    - by sheefy
    I'm having a little bit of trouble saving data to a database. Basically, I have a main table that has associations to other tables (example below):

        Tbl_Listing
            ID
            UserID      - associated to ID in the User table
            CategoryID  - associated to ID in the Category table
            LevelID     - associated to ID in the Level table
            Name
            Address

    Normally it's easy for me to add data to the DB (using Entity Framework). However, I'm not sure how to add data to the fields with associations. The various ID fields just need to hold an int value that corresponds to the ID in the associated table. For example, when I try to access the column in the following manner I get an "Object reference not set to an instance of an object." error:

        Listing NewListing = new Listing();
        NewListing.Tbl_User.ID = 1;
        NewListing.Tbl_Category.ID = 2;
        ...
        DBEntities.AddToListingSet(NewListing);
        DBEntities.SaveChanges();

    I am using NewListing.Tbl_User.ID instead of NewListing.UserID because the UserID field is not available through IntelliSense. If I try to create an object for each related field I get a "The relationship between the two objects cannot be defined because they are attached to different ObjectContext objects." error. With this method, I am trying to add the object without the .ID shown above - for example NewListing.User = UserObject. I know this should be simple, as I just want to reference the ID from the associated table in the main Listing table. Any help would be greatly appreciated. Thanks in advance, -S

    Read the article

  • How can I [simply] consume JSON Data to display to the page

    - by Atomiton
    I usually use JSON with jQuery to just return a string of HTML. However, I want to start using JavaScript objects in my code. What's the simplest way to get started using JSON objects on my page? Here's a sample Ajax call (inside $(document).ready(...), of course):

        $('#btn').click(function(event) {
            event.preventDefault();
            var out = $('#result');
            $.ajax({
                url: "CustomerServices.asmx/GetCustomersByInvoiceCount",
                success: function(msg) {
                    // Iterate through the json results and spit them out to a page?
                },
                data: "{ 'invoiceCount' : 100 }"
            });
        });

    My WebMethod:

        [WebMethod(Description="Gets customers with more than n invoices")]
        public List<Customer> GetCustomersByInvoiceCount(int? invoiceCount)
        {
            using (dbDataContext db = new dbDataContext())
            {
                return db.Customers.Where(c => c.InvoiceCount >= invoiceCount);
            }
        }

    What gets returned:

        {"d":[{"__type":"Customer","Account":"1116317","Name":"SOME COMPANY","Address":"UNit 1 , 392 JOHN ST. ","LastTransaction":"\/Date(1268294400000)\/","HighestBalance":13922.34},{"__type":"Customer","Account":"1116318","Name":"ANOTHER COMPANY","Address":"UNIT #345 , 392 JOHN ST. ","LastTransaction":"\/Date(1265097600000)\/","HighestBalance":549.42}]}

    What I'd like to know is what people generally do with this returned JSON. Do you iterate through the properties and create an HTML table on the fly? Is there a way to "bind" JSON data using a JavaScript version of reflection (something like the .NET GridView control)? Do you throw this returned data into a JavaScript object and then do something with it? An example of what I want to achieve is to have a plain ol' HTML page (on a mobile device) with a list of a salesperson's customers. When one of those customers is clicked, the customer ID gets sent to a web service which retrieves the customer details that are relevant to a salesperson. I know the SO talent pool is quite deep, so I figured you all here would be able to guide me in the right direction and give me a few ideas on the best way to approach this.

    Read the article

  • sending the data from form to db in django

    - by BharatKrishna
    I have a form in which I can input text through text boxes. How do I make these data go into the DB on clicking submit? This is the code of the form in the template:

        <form method="post" action="app/save_page">
            <p> Title: <input type="text" name="title"/> </p>
            <p> Name: <input type="text" name="name"/> </p>
            <p> Phone: <input type="text" name="phone"/> </p>
            <p> Email: <input type="text" name="email"/> </p>
            <p> <textarea name="description" rows=20 cols=60> </textarea><br> </p>
            <input type="submit" value="Submit"/>
        </form>

    I have a function in views.py for saving the data from the page, but I don't know how to implement it properly:

        def save_page(request):
            title = request.POST["title"]
            name = request.POST["name"]
            phone = request.POST["phone"]
            email = request.POST["email"]
            description = request.POST["description"]

    Now how do I send these into the DB? What do I put in views.py so that the data goes into the DB? How do I open a database connection, put those values into the DB and save it? Should I do something like:

        connection = sqlite3.connect('app.db')
        cursor = connection.cursor()
        .....
        .....
        connection.commit()
        connection.close()

    Thank you.
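
    Editor's note: the idiomatic Django route is to define a model for this data and save it inside the view, letting Django manage the database connection, instead of calling sqlite3.connect() by hand. A minimal sketch is below - the model name Contact and the app layout are assumptions for illustration, while the field names follow the form above.

        # app/models.py
        from django.db import models

        class Contact(models.Model):
            title = models.CharField(max_length=200)
            name = models.CharField(max_length=200)
            phone = models.CharField(max_length=50)
            email = models.EmailField()
            description = models.TextField()

        # app/views.py
        from django.http import HttpResponse
        from app.models import Contact

        def save_page(request):
            contact = Contact(
                title=request.POST["title"],
                name=request.POST["name"],
                phone=request.POST["phone"],
                email=request.POST["email"],
                description=request.POST["description"],
            )
            contact.save()    # writes the row to the database configured in settings.py
            return HttpResponse("saved")

    A Django ModelForm would add validation on top of this, but the save() call is the piece that actually writes the row.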

    Read the article

  • how to fetch data using jquery

    - by user3566029
    I have tried to connect to my index.php file on 000webhost.com using jQuery. From the Site menu I connected to 000webhost using the FTP details available on 000webhost.com, but when I try to read data with jQuery it doesn't work. Can anyone tell me what I need to change? Do I need to connect my Dreamweaver database to the web host? If yes, please explain. This is what I have done.

    index.php:

        $mysql_host = "mysql0.000webhost.com";
        $mysql_database = "a000000_mydb";
        $mysql_user = "a000000_root";
        $mysql_password = "******";
        $conn = @mysql_connect($mysql_host, $mysql_user, $mysql_password) or die('aa');
        mysql_select_db($mysql_database, $conn) or die('eoor on db');
        $quer = mysql_query("SELECT * FROM sam");
        $res = array();
        while ($row = mysql_fetch_row($quer)) {
            $res[] = $row;
        }
        print(json_encode($res));
        return json_encode($res);
        mysql_close();

    index.html:

        $.get('public_html/index.php', function( data ) {
            alert( 'Successful' + data);
        });

    Read the article

  • Populate an Object Model from a DataTable (C# 3.0)

    - by Newbie
    I have a situation where I am getting data from some external sources and populating it into a DataTable. The data looks like this:

        DATE        WEEK    FACTOR
        3/26/2010   1       RM_GLOBAL_EQUITY
        3/26/2010   1       RM_GLOBAL_GROWTH
        3/26/2010   2       RM_GLOBAL_VALUE
        3/26/2010   2       RM_GLOBAL_SIZE
        3/26/2010   2       RM_GLOBAL_MOMENTUM
        3/26/2010   3       RM_GLOBAL_HIST_BETA

    I have an object model like this:

        public class FactorReturn
        {
            public int WeekNo { get; set; }
            public DateTime WeekDate { get; set; }
            public Dictionary<string, decimal> FactorCollection { get; set; }
        }

    As can be seen, the Date field is always constant, and a single (unique) week can have multiple FACTORS. That is, for the date 3/26/2010, week no. 1 has two factors (RM_GLOBAL_EQUITY and RM_GLOBAL_GROWTH); similarly, for the date 3/26/2010, week no. 2 has three factors (RM_GLOBAL_VALUE, RM_GLOBAL_SIZE and RM_GLOBAL_MOMENTUM). Now we need to populate this data into our object model. The final output will be:

        WeekDate: 3/26/2010
            WeekNo: 1
                FactorCollection: RM_GLOBAL_EQUITY
                FactorCollection: RM_GLOBAL_GROWTH
            WeekNo: 2
                FactorCollection: RM_GLOBAL_VALUE
                FactorCollection: RM_GLOBAL_SIZE
                FactorCollection: RM_GLOBAL_MOMENTUM
            WeekNo: 3
                FactorCollection: RM_GLOBAL_HIST_BETA

    That is, overall only one single collection, where the factor type varies depending on the week number. I have tried but without success; nothing works. Could you please help me? I feel it is very tough. I am using C# 3.0. Thanks
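
    Editor's note: the transformation being asked for is "group the rows by week number, collecting the factors for each week", which in C# 3.0 maps naturally onto LINQ's GroupBy over the DataTable rows. Purely to show the shape of that grouping, here it is sketched in Python over hard-coded tuples; the values are copied from the question and everything else is illustrative.

        from collections import defaultdict

        rows = [
            ("3/26/2010", 1, "RM_GLOBAL_EQUITY"),
            ("3/26/2010", 1, "RM_GLOBAL_GROWTH"),
            ("3/26/2010", 2, "RM_GLOBAL_VALUE"),
            ("3/26/2010", 2, "RM_GLOBAL_SIZE"),
            ("3/26/2010", 2, "RM_GLOBAL_MOMENTUM"),
            ("3/26/2010", 3, "RM_GLOBAL_HIST_BETA"),
        ]

        # one entry per (date, week), each holding the collection of factors for that week
        grouped = defaultdict(list)
        for date, week, factor in rows:
            grouped[(date, week)].append(factor)

        for (date, week), factors in sorted(grouped.items()):
            print(date, "week", week, "->", factors)
        # 3/26/2010 week 1 -> ['RM_GLOBAL_EQUITY', 'RM_GLOBAL_GROWTH']
        # 3/26/2010 week 2 -> ['RM_GLOBAL_VALUE', 'RM_GLOBAL_SIZE', 'RM_GLOBAL_MOMENTUM']
        # 3/26/2010 week 3 -> ['RM_GLOBAL_HIST_BETA']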

    Read the article

  • Java - Display % of upload done

    - by tr-raziel
    I have a Java applet for uploading files to a server. I want to display the % of data sent, but when I use ObjectOutputStream.write() it just writes to the buffer and does not wait until the data has actually been sent. How can I achieve this? Perhaps I need to use thread synchronization or something. Any clues would be most helpful. This is the code I'm using right now:

        try {
            for (File file : ficheiros) {
                FileInputStream stream = new FileInputStream(file);
                int bytesRead1 = 0;
                int off1 = 0;
                int len1 = 100000;
                if (file.length() < 100000)
                    len1 = new Long(file.length()).intValue();
                byte[] bytes1 = new byte[len1];
                while (off1 < file.length()) {
                    bytes1 = new byte[len1];
                    if ((file.length() - off1) < len1) {
                        len1 = (new Long(file.length()).intValue() - off1);
                        bytes1 = new byte[len1];
                    }
                    if ((bytesRead1 = stream.read(bytes1)) != -1) {
                        // I want this to block until all data has been sent
                        outputToServlet.write(bytes1, 0, bytesRead1);
                        System.out.println("off1: " + off1);
                        off1 = off1 + len1;
                        outputToServlet.flush();
                    }
                    sent += len1;
                    if (sent > totalLength)
                        sent = (int) totalLength;
                    updateFeedback(sent, totalLength, false);   // calls method to display %
                }
                updateFeedback(-1, -1, true);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }

    Thanks

    Read the article

  • Problem posting multipart form data using Apache with mod_proxy to a mongrel instance

    - by Ryan E
    I am attempting to simulate my site's production environment as closely as I can on my local machine. This is a Rails site that uses Apache with mod_proxy to forward requests to a mongrel cluster. On my Mac OS X Leopard machine, I have the default install of Apache running and have configured a vhost to use mod_proxy to forward requests to a locally running mongrel instance on port 3000:

        <Proxy balancer://mongrel_cluster-development>
            BalancerMember http://127.0.0.1:3000
        </Proxy>

    For the most part, this is working fine. I can browse my development site using the ServerName of the vhost I configured and can confirm that requests are being properly forwarded to the mongrel instance. However, there is a page on the site with a multipart form that is used to upload an image to the server. When I post this form, there is a delay of about 5 minutes and the browser ultimately returns a Bad Request: "Your browser sent a request that this server could not understand." In the error log for my vhost:

        [Tue Sep 22 09:47:57 2009] [error] (70007)The timeout specified has expired: proxy: prefetch request body failed to 127.0.0.1:3000 (127.0.0.1) from ::1 ()

    This same form works fine if I browse directly to the mongrel instance (http://127.0.0.1:3000). Does anybody have any idea what the problem might be and how to fix it? If there is any important information that I neglected to include, post a comment and I can add it to this question. Note: upon further investigation, this appears to be a problem specific to Safari. The form works fine in Firefox.

    Read the article

  • Microsoft DNS :: Review Traffic Data?

    - by BSI Support
    I have primary and secondary (public) DNS servers running Microsoft DNS on Windows 2003. I would like suggestions on software for DNS log analysis. At the moment, I'm curious to see which domains are being requested, in an effort to clean up old registrar-based name server settings (which I do not have access to change myself). Any help would be appreciated.

    Read the article

  • How should I fix problems with file permissions while restoring from Time Machine?

    - by Andrew Grimm
    While restoring files from a Time Machine backup, I got the error message "The operation can’t be completed because you don’t have permission to access some of the items." because of problems with files in one folder. What's the safest way to deal with this? The folder in question has permissions like:

        Andrew-Grimms-MacBook-Pro:kmer agrimm$ pwd
        /Volumes/Time Machine Backups/Backups.backupdb/Andrew Grimm’s MacBook Pro/2010-12-09-224309/Macintosh HD/Users/agrimm/ruby/kmer
        Andrew-Grimms-MacBook-Pro:kmer agrimm$ ls -ltra
        total 6156896
        drwxrwxrwx@  19 agrimm  staff        680 18 Jan  2008 Saccharomyces_cerevisiae
        -r--------@   1 agrimm  staff   60221852  4 Aug  2009 hs_ref_GRCh37_chrY.fa
        -r--------@   1 agrimm  staff  157488804  4 Aug  2009 hs_ref_GRCh37_chrX.fa
        (snip a few files)
        -r--------@   1 agrimm  staff     676063 27 Oct  2009 NC_001143.fna
        -rw-r--r--@   1 agrimm  staff       6148 23 Mar  2010 .DS_Store
        drwxr-xr-x@   3 agrimm  staff       1530 23 Mar  2010 .
        drwxr-xr-x@  30 agrimm  staff       1054 20 Nov  14:43 ..

    Is it ok to do sudo chmod, or is there a safer approach? Background: files within the original folder on my computer also had weird permissions - I suspect I may have used sudo to copy some files from a thumbdrive onto my computer.

    Read the article

  • How to migrate ScrapBook data to Evernote?

    - by Daren Thomas
    I have tons of info stored in a ScrapBook (the Firefox plugin) installation at home. But that's my problem: it's at home! I'd just love to be able to post all that stuff to Evernote. In fact, syncing would be best, since ScrapBook has some really cool editing features for websites... Has anyone gone that route?

    Read the article

  • Testdisk won’t list files for an ext4 partition inside a LVM inside a LUKS partition

    - by user1598585
    I have accidentally deleted a file that I want to recover. The partition is an ext4 partition inside an LVM partition that is encrypted with dm-crypt/LUKS. The encrypted LUKS partition is /dev/sda2, which contains a physical volume with a single volume group mapped to /dev/mapper/system, and the logical volume holding the ext4 partition is mapped to /dev/mapper/system-home. Running # testdisk /dev/mapper/system-home recognizes it as an ext4 partition, but testdisk tells me the partition seems damaged when I try to list the files. If I run # testdisk /dev/mapper/system it detects all the partitions, but the same thing happens when I try to list their files. Am I doing something wrong, or is this a known bug? I have searched but haven't found any clue.

    Read the article

  • Sorting in Pivot Table on how data is summarized, not just the value

    - by user26453
    Often I am creating pivot tables that summarize some count by some category. Let's say I am counting Yes/No responses by some category. I usually add the count field and display it as a "% of row", and then create a pivot chart. However, if I want to sort one of the columns, say "Yes", Excel sorts by the underlying count, not the calculated percentage. Any way around this?

    Read the article

  • SQL Server 2008 login problem with ASP.NET application: Failed to open the explicitly specified database

    - by eulerfx
    I am running SQL Server 2008 Express Edition on Windows Server 2008 with an ASP.NET application that must access the server. The ASP.NET application is associated with an application pool that runs under the NetworkService account. This account in turn has a Login and User record on SQL Server in the required database. When I attempt to run the ASP.NET website I get a blank page, and in the error log I see this information event record:

        Login failed for user 'NT AUTHORITY\NETWORK SERVICE'. Reason: Failed to open the explicitly specified database. [CLIENT: myLocalMachine]

    The connection string has Trusted_Connection=True; and the required database specified. When I explicitly specify the user name and password I get another login error stating the password is incorrect, even though the same username/password combination works through SQL Server Management Studio. The NETWORK SERVICE account seems to have all the required privileges for the database. Also, I made a test ASP.NET website project that does a simple SELECT from a table in that database, and using the same config file I do not get the error and it seems to work. Is it something to do with trust levels, then? The original ASP.NET web app references various DLLs, including open-source libraries. Also, the application does not seem to be able to write to the event log itself, throwing a security exception, even though everything in the config files, including machine.config, states the app is in full trust.

    Read the article

  • Converting Future Composer FC14 tracker music to MIDI data

    - by okw
    I have an old .FC music file from the Amiga/C64 days. It was made using Future Composer. The first four bytes of the file read as FC14 in ASCII; I'm pretty sure that's the version number. I need to dump the channels and their notes into a standard MIDI file in order to play it through my MIDI devices. Is there a way to do this with existing programs? If not, are there any specifications available on the format of these files? I do not require the samples, and I am aware that they will be lost during the conversion process.

    Read the article

  • Problems configuring DB2 CLI/ODBC System DSN ODBC Data Source Administrator

    - by Komyg
    I am trying to create a System DSN ODBC connection to a DB2 9.5 database, but I am running into a very strange problem. I looked around the internet and found the following page with instructions on how I should proceed: http://www.ryslander.com/how-to-install-and-configure-db2-odbc-driver/. I followed these instructions and I am able to create a new System DSN, but when I try to configure it, it seems as if my configuration doesn't stick. For example, when I click the "Configure" button on my System DSN, add a TCP/IP protocol configuration on the "Advanced Settings" tab and click "OK", no errors appear, but when I click "Configure" again my TCP/IP setting has vanished. The same happens to all my other settings, such as database name, username, password, etc. Could you help me figure out what I am doing wrong? Note: my user is in the Administrators group and I am using Windows Server 2008 R2 Enterprise x64. UPDATE: I managed to create a User DSN and connect to the database. However, the problem with the System DSN remains.

    Read the article

  • How to recover a file using the FAT cluster chain instead of using the stored length in the FAT table?

    - by cadrian
    I'm trying to recover movie files from my TNT receiver's hard drive, but it corrupts its FAT32 allocation table (crappy cheap device...). Using dosfsck is useless because the correct file length is the cluster-chain length, not the (shorter) one in the table, and dosfsck only offers to shorten the file, which I won't do. Question: how can I recover a file using the FAT cluster chain instead of the stored length in the FAT table? Edit: I forgot to say - Linux solutions only, please (I have no Windows box).

    Read the article

  • Recovering/Rebuilding MySQL .FRM, .MY* files from IBDATA1

    - by Synetech
    I recently had an incident in which several MySQL files were wiped out (mostly from WordPress, but also a few of MySQL’s own files). The IBDATA1 file is unaffected, but several .frm files are gone, as are a few .myi and .myd files. So now I need to find out if there is a way to rebuild the missing files from IBDATA1. I tried Googling it, assuming that such an issue has come up before, and indeed there were numerous search results (including this question), but all of the ones I looked at were about the opposite direction - recovering from .frm and .my* files - or somehow required those files. Is there a way to rebuild these files? I know I have a relatively recent backup (a .SQL file) if there isn't, but I'm hoping these are the kind of files that are rebuilt if missing or outdated.

    Read the article

  • Data loss through permissions change?

    - by charliehorse55
    I seem to have deleted some files on my media drive, simply by changing the permissions.

    The Story
    I have many operating systems installed on my computer and constantly switch between them. I bought a 1 TB HD and formatted it as HFS+ (not journaled). It worked well between OS X and all of my Linux installations while having much better metadata support than NTFS. I never synced the UIDs for my operating systems, so the permissions were always doing funny things. Yesterday I tried to fix the permissions by first changing the UIDs of the other operating systems to match OS X, and then changing the file ownership of all files on the drive to match OS X. About 50% of the files on the drive were originally owned by OS X, the other half by the various Linux installations. I started to try to change the file permissions for the folders, and that's when it went south.

    The Commands
    These commands were run recursively on one section of the drive:

        sudo chflags nouchg
        sudo chflags -N
        sudo chown myusername
        sudo chmod 666
        sudo chgrp staff

    The Bad
    Sometime during the execution of these commands, all of the files belonging to OS X were deleted. If a folder had Linux-based files it would remain intact, but any folder containing exclusively OS X files was erased. If a folder containing Linux files also contained a subfolder with only OS X files, the subfolder would remain but is inaccessible and displays a file size of 0 bytes. Luckily these commands were only run on the videos folder; I also have a music folder with the same issue, but I did not execute any of these commands on it. Effectively I have examples of the file permissions for all three states - the Linux files before and after, and the OS X files before.

    OS X file before:

        -rw-r--r--@ 1 charliehorse  1000  3634241 15 Nov  2008 /path/to/file
            com.apple.FinderInfo    32

    Linux file before:

        -rw-r--r--@ 1 charliehorse  1000  5321776 20 Sep  2002 /path/to/file/
            com.apple.FinderInfo    32

    Linux file after (read only) (different file, but I believe the same permissions originally):

        -rw-rw-rw-@ 1 charliehorse  staff  366982610 17 Jun  2008 /path/to/file
            com.apple.FinderInfo    32

    These files still exist, so if there are any other commands to run on them to determine what has happened here, I can do that.

    EDIT
    Running ls on one of the "empty" deleted OS X folders yields this:

        ls: .: Permission denied
        ls: ..: Permission denied
        ls: subdirA: Permission denied
        ls: subdirB: Permission denied
        ls: subdirC: Permission denied
        ls: subdirD: Permission denied

    I believe my files might still be there, but the permissions are screwed.

    Read the article

  • SpinRite and USB blues - does a solution exist?

    - by Peter Mortensen
    I use SpinRite to recover hard disks and their content, and to a lesser degree for preventive maintenance. However, if a USB drive (a USB thumb drive and/or an external hard disk with a USB interface) is connected when SpinRite scans for devices, SpinRite hangs and never finishes. The workaround is of course to disconnect the drive, but there is value in being able to use SpinRite on USB drives: some external drives have no screws and it is difficult to take out the hard disk without damaging the casing, and for those that do have screws it would save the disassembly time. Is there a way to fix this problem (e.g. BIOS changes or a modified SpinRite boot CD) without resorting to floppy disks?

    Read the article
