Search Results

Search found 3856 results on 155 pages for 'io'.

Page 23/155

  • UVA Online Judge 3n+1: Right answer is Wrong answer

    - by Samuraisoulification
    I've been toying with this problem for more than a week now and have optimized it a lot. I seem to be getting the right answer, since it's the same as the answers of others that got accepted, but I keep getting Wrong Answer. I'm not sure what's going on! Does anyone have any advice? I think it's a problem with the input or the output, because I'm not exactly sure how this judge works. So if anyone could pinpoint the problem, and also give me any advice on my code, I'd be very appreciative!

      #include <iostream>
      #include <cstdlib>
      #include <stdio.h>
      #include <vector>
      using namespace std;

      class Node { // node for each number that has the cycles and number
      private:
          int number;
          int cycles;
          bool cycleset; // so it knows whether to re-set the cycle
      public:
          Node(int num) { number = num; cycles = 0; cycleset = false; }
          int getnumber() { return number; }
          int getcycles() { return cycles; }
          void setnumber(int num) { number = num; }
          void setcycles(int num) { cycles = num; cycleset = true; }
          bool cycled() { return cycleset; }
      };

      class Cycler {
      private:
          vector<Node> cycleArray;
          int biggest;
          int cycleReal(unsigned int number) { // actually cycles through the number
              int cycles = 1;
              if (number != 1) {
                  if (number < 1000000) { // makes sure it's in vector bounds
                      if (!cycleArray[number].cycled()) { // sees if it's been cycled
                          if (number % 2 == 0) { cycles += this->cycleReal((number / 2)); }
                          else { cycles += this->cycleReal((3 * number) + 1); }
                      } else { // if cycled, get the number of cycles and don't re-calculate; ends recursion
                          cycles = cycleArray[number].getcycles();
                      }
                  } else { // continues recursing if it's too big for the vector
                      if (number % 2 == 0) { cycles += this->cycleReal((number / 2)); }
                      else { cycles += this->cycleReal((3 * number) + 1); }
                  }
              }
              if (number < 1000000) { // sets cycles table for the number in the vector
                  if (!cycleArray[number].cycled()) { cycleArray[number].setcycles(cycles); }
              }
              return cycles;
          }
      public:
          Cycler() {
              biggest = 0;
              for (int i = 0; i < 1000000; i++) { // initialize the vector, set the numbers
                  Node temp(i);
                  cycleArray.push_back(temp);
              }
          }
          int cycle(int start, int end) { // cycles through the inputted numbers
              int size = 0;
              for (int i = start; i < end; i++) {
                  size = this->cycleReal(i);
                  if (size > biggest) { biggest = size; }
              }
              int temp = biggest;
              biggest = 0;
              return temp;
          }
          int getBiggest() { return biggest; }
      };

      int main() {
          Cycler testCycler;
          int i, j;
          while (cin >> i >> j) { // read in until \n
              int biggest = 0;
              if (i > j) { biggest = testCycler.cycle(j, i); }
              else { biggest = testCycler.cycle(i, j); }
              cout << i << " " << j << " " << biggest << endl;
          }
          return 0;
      }

    Read the article

  • Ruby Thread with "watchdog"

    - by Sergio Campamá
    I'm implementing a Ruby server for handling sockets created by GPRS modules. The thing is that when the module powers down, there's no indication that the socket closed. I'm using threads to handle multiple sockets with the same server. What I'm asking is this: is there a way to use a timer inside a thread, reset it after every socket input, and have it close the thread if the timeout is reached? Where can I find more information about this? EDIT: Code example that doesn't detect the socket closing:

      require 'socket'

      server = TCPServer.open(41000)
      loop do
        Thread.start(server.accept) do |client|
          puts "Client connected"
          begin
            loop do
              line = client.readline
              open('log.txt', 'a') { |f| f.puts line.strip }
            end
          rescue
            puts "Client disconnected"
          end
        end
      end

    Read the article

  • Cannot write to SD card -- canWrite is returning false

    - by Fizz
    Sorry for the ambiguous title, but I'm doing the following to write a simple string to a file:

      try {
          File root = Environment.getExternalStorageDirectory();
          if (root.canWrite()) {
              System.out.println("Can write.");
              File def_file = new File(root, "default.txt");
              FileWriter fw = new FileWriter(def_file);
              BufferedWriter out = new BufferedWriter(fw);
              String defbuf = "default";
              out.write(defbuf);
              out.flush();
              out.close();
          } else {
              System.out.println("Can't write.");
          }
      } catch (IOException e) {
          e.printStackTrace();
      }

    But root.canWrite() seems to be returning false every time. I am not running this off an emulator; I have my Android Eris plugged into my computer via USB and I'm running the app on the phone via Eclipse. Is there a way of giving my app permission so this doesn't happen? Also, this code seems to create the file default.txt, but what if it already exists? Will it skip the creation and just open it for writing, or do I have to catch something like FileAlreadyExists (if such an exception exists) and then just open and write? Thanks for any help guys.
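
    A hedged sketch of the checks that usually matter here, not a confirmed fix for this exact device: external storage must report MEDIA_MOUNTED rather than MEDIA_SHARED (the card is handed over to the PC while USB mass storage is enabled), and the app needs android.permission.WRITE_EXTERNAL_STORAGE declared in its manifest. On the existing-file question, FileWriter(File) creates the file if it is missing and truncates it if it already exists, so no "file already exists" exception needs to be caught. The class and method names below are made up for illustration.

      import java.io.BufferedWriter;
      import java.io.File;
      import java.io.FileWriter;
      import java.io.IOException;
      import android.os.Environment;

      public class SdCardWriter {
          // Writes text to default.txt on external storage, creating or overwriting it.
          // Assumes <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
          // is declared in AndroidManifest.xml.
          public static void writeDefault(String text) throws IOException {
              String state = Environment.getExternalStorageState();
              if (!Environment.MEDIA_MOUNTED.equals(state)) {
                  // MEDIA_SHARED means the card is exported to the PC over USB and is not writable here.
                  throw new IOException("External storage not writable, state=" + state);
              }
              File def = new File(Environment.getExternalStorageDirectory(), "default.txt");
              BufferedWriter out = new BufferedWriter(new FileWriter(def)); // creates or truncates
              try {
                  out.write(text);
              } finally {
                  out.close();
              }
          }
      }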

    Read the article

  • C# - Problem while listing directories - DirectoryNotFoundException

    - by HoNgOuRu
    I'm getting a "DirectoryNotFoundException" error. Here is the code:

      string directorio = "D:\MUSICA\La Trampa - El Mísero Espiral De Encanto";
      DirectoryInfo dir = new DirectoryInfo(directorio);
      DirectoryInfo[] dirs = dir.GetDirectories();   // <-- this is the line where I'm having the problem

    I believe it's caused when it tries to parse the accented part of that string, "Mísero". The directory "D:\MUSICA\La Trampa - El Mísero Espiral De Encanto" exists, because I can see it and it also has some files in it. Is there any way to send this string in the correct way? Thanks

    Read the article

  • shell_exec() Doesn't Show The Output

    - by Nathan Campos
    I'm building a PHP site that uses a shell_exec() call like this:

      $file = "upload/" . $_FILES["file"]["name"];
      $output = shell_exec("leaf $file");
      echo "<pre>$output</pre>";

    Here leaf is a program located in the same directory as my script, but when I try to run this script on the server, I just get nothing. What is wrong?

    Read the article

  • Execute an Application On The Server Using JavaScript

    - by Nathan Campos
    I have an application on my server called leaf.exe that needs two arguments to run, inputfile and outputfile, like in this example:

      pnote.exe input.pnt output.txt

    Both the executable and the input file are in the same directory as my home page file. I need JavaScript to run the application like that; how could I do this?

    Read the article

  • C# "Could not find a part of the path" - Creating Local File

    - by Pyronaut
    I am trying to write to a folder that is located on my C:\ drive. I keep getting the error: Could not find a part of the path. My file path looks basically like this:

      C:\WebRoot\ManagedFiles\folder\thumbs\5c27a312-343e-4bdf-b294-0d599330c42d\Image\lighthouse.jpg

    And I am writing to it like so:

      using (MemoryStream memoryStream = new MemoryStream())
      {
          thumbImage.Save(memoryStream, ImageFormat.Jpeg);
          using (FileStream diskCacheStream = new FileStream(cachePath, FileMode.CreateNew))
          {
              memoryStream.WriteTo(diskCacheStream);
          }
          memoryStream.WriteTo(context.Response.OutputStream);
      }

    Don't worry too much about the memory stream; it is just outputting the image (after I save it). Since I am creating a file, I am a bit perplexed as to why it cannot find the file (shouldn't it just write to where I tell it to, regardless?). The strange thing is, it has no issue when I test the path above using File.Exists. Obviously that returns false, but it means that at least my file path is semi-legit. Any help is much appreciated.

    Read the article

  • Online Image Slideshow Question. File Access Problems.

    - by msandbot
    Hi, I have a Flash .swf file that I embed on my webpage. On my server I have the .swf file and multiple image folders. I would like to load every file in one of those folders into the Flash slideshow. How should I go about doing this? I tried using AIR, but it doesn't work on my system as an application, so I doubt it will work online. Eventually I plan on making a menu where you can select different folders to display, and since they are of different sizes, a foreach loop would be optimal. Keeping a txt file with the number of images is also possible if there's a way to read that in, but I would prefer the more dynamic approach. I am working towards using PHP for the website, if that helps find a solution. Thanks, -Mike. Also, my slideshow currently works great online, but I have to hardcode the number of files.

    Read the article

  • Second Thread Holding Up Entire Program in C# Windows Form Application

    - by Brandon
    In my Windows Forms application, I'm trying to test the user's ability to access a remote machine's shared folder. The way I'm doing this (and I'm sure there are better ways, but I don't know of them) is to check for the existence of a specific directory on the remote machine (I'm doing this because of firewall and other security restrictions that I'm confronted with in my organization). If the user has rights to access the shared folder, then it returns in no time at all, but if they don't, it hangs forever. To solve this, I threw the check into another thread and wait only 1000 milliseconds before determining that the share can't be hit by the user. However, when I do this, it still hangs as if it were never run in a separate thread. What is making it hang and how do I fix it? I would think that the fact that it is in a separate thread would allow me to just let the thread finish on its own in the background. Here is my code:

      bool canHitInstallPath = false;
      Thread thread = new Thread(new ThreadStart(() =>
      {
          canHitInstallPath = Directory.Exists(compInfo.InstallPath);
      }));
      thread.Start();
      thread.Join(1000);
      if (canHitInstallPath == false)
      {
          throw new Exception("Cannot hit folder: " + compInfo.InstallPath);
      }

    Read the article

  • java.util.zip - ZipInputStream vs. ZipFile

    - by lucho
    Hello, community! I have some general questions regarding the java.util.zip library. What we basically do is an import and an export of many small components. Previously these components were imported and exported using a single big file, e.g.:

      <component-type-a id="1"/>
      <component-type-a id="2"/>
      <component-type-a id="N"/>
      <component-type-b id="1"/>
      <component-type-b id="2"/>
      <component-type-b id="N"/>

    Please note that the order of the components during import is relevant. Now every component should occupy its own file, which should be externally versioned, QA-ed, bla, bla. We decided that the output of our export should be a zip file (with all these files in it) and the input of our import should be a similar zip file. We do not want to explode the zip in our system. We do not want to open separate streams for each of the small files. My current questions:

    Q1. Does ZipInputStream guarantee that the zip entries (the little files) will be read in the same order in which they were inserted by our export that uses ZipOutputStream? I assume reading is something like:

      ZipInputStream zis = new ZipInputStream(new BufferedInputStream(fis));
      ZipEntry entry;
      while ((entry = zis.getNextEntry()) != null) {
          // read from zis until available
      }

    I know that the central zip directory is put at the end of the zip file, but nevertheless the file entries inside have a sequential order. I also know that relying on the order is an ugly idea, but I just want to have all the facts in mind.

    Q2. If I use ZipFile (which I prefer), what is the performance impact of calling getInputStream() hundreds of times? Will it be much slower than the ZipInputStream solution? The zip is opened only once and ZipFile is backed by RandomAccessFile - is this correct? I assume reading is something like:

      ZipFile zipfile = new ZipFile(argv[0]);
      Enumeration e = zipfile.entries(); // TODO: assure the order of the entries
      while (e.hasMoreElements()) {
          entry = (ZipEntry) e.nextElement();
          is = zipfile.getInputStream(entry);
      }

    Q3. Are the input streams retrieved from the same ZipFile thread safe (e.g. may I read different entries in different threads simultaneously)? Any performance penalties?

    Thanks for your answers!
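
    A minimal sketch of the two read paths under comparison, kept Java 6 compatible so both can be pointed at the same archive and timed; class and method names are placeholders and the per-entry processing bodies are left empty.

      import java.io.*;
      import java.util.Enumeration;
      import java.util.zip.*;

      public class ZipReadComparison {

          // Sequential read: ZipInputStream hands back entries in the order they
          // physically appear in the archive (the order ZipOutputStream wrote them).
          static void readSequentially(File archive) throws IOException {
              ZipInputStream zis = new ZipInputStream(new BufferedInputStream(new FileInputStream(archive)));
              try {
                  ZipEntry entry;
                  byte[] buf = new byte[8192];
                  while ((entry = zis.getNextEntry()) != null) {
                      int n;
                      while ((n = zis.read(buf)) != -1) {
                          // process buf[0..n) for the current entry
                      }
                  }
              } finally {
                  zis.close();
              }
          }

          // Random-access read: one open ZipFile, one short-lived stream per entry.
          static void readViaZipFile(File archive) throws IOException {
              ZipFile zip = new ZipFile(archive);
              try {
                  Enumeration<? extends ZipEntry> entries = zip.entries();
                  while (entries.hasMoreElements()) {
                      ZipEntry entry = entries.nextElement();
                      InputStream in = zip.getInputStream(entry); // a fresh stream per call
                      try {
                          // read the entry fully here
                      } finally {
                          in.close();
                      }
                  }
              } finally {
                  zip.close();
              }
          }
      }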

    Read the article

  • Creating and writing file from a FileOutputStream in Java

    - by Althane
    Okay, so I'm working on a project where I use a Java program to initiate a socket connection between two classes (a FileSender and a FileReceiver). My basic idea was that the FileSender would look like this:

      try {
          writer = new DataOutputStream(connect.getOutputStream());
      } catch (IOException e) {
          // TODO Auto-generated catch block
          e.printStackTrace();
      }
      // While we have bytes to send
      while (filein.available() > 0) {
          // We write them out to our buffer
          writer.write(filein.read(outBuffer));
          writer.flush();
      }
      // Then close our filein
      filein.close();
      // And then our socket
      connect.close();
      } catch (IOException e) {
          // TODO Auto-generated catch block
          e.printStackTrace();

    The constructor contains code that checks that the file exists, that the socket is connected, and so on. Inside my FileReceiver is this, though:

      input = recvSocket.accept();
      BufferedReader br = new BufferedReader(new InputStreamReader(input.getInputStream()));
      FileOutputStream fOut = new FileOutputStream(filename);
      String line = br.readLine();
      while (line != null) {
          fOut.write(line.getBytes());
          fOut.flush();
          line = br.readLine();
      }
      System.out.println("Before RECV close statements");
      fOut.close();
      input.close();
      recvSocket.close();
      System.out.println("After RECV clsoe statements");

    All inside a try-catch block. So, what I'm trying to do is have the FileSender read in the file, convert it to bytes, send it and flush it out. The FileReceiver then reads in the bytes, writes them to fOut, flushes, and continues waiting for more. I make sure to close all the things that I open, so... here comes the weird part. When I try to open the created text file in Eclipse, it tells me "An SWT error has occured ... recommended to exit the workbench... see .log for more details." Another window pops up saying "Unhandled event loop exception, (no more handles)". However, if I try to open the sent text file in notepad2, I get ThisIsASentTextfile, which is good (well, minus the fact that there should be line breaks, but I'm working on that...). Does anyone know why this is happening? And while we're checking, how do I add the line breaks? (And is this a particularly bad way to transfer files over Java without getting some other libraries?)
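
    For reference, a hedged sketch of a byte-for-byte receiver, one common way around the readLine()/getBytes() round trip: copying the raw stream preserves line breaks exactly and works for binary files as well. The port number and output file name are placeholders.

      import java.io.*;
      import java.net.ServerSocket;
      import java.net.Socket;

      public class RawFileReceiver {
          public static void main(String[] args) throws IOException {
              ServerSocket server = new ServerSocket(4444);             // placeholder port
              Socket client = server.accept();
              InputStream in = client.getInputStream();
              OutputStream out = new FileOutputStream("received.txt");  // placeholder name
              try {
                  byte[] buf = new byte[4096];
                  int n;
                  while ((n = in.read(buf)) != -1) {   // -1 means the sender closed the socket
                      out.write(buf, 0, n);            // bytes go out exactly as they came in
                  }
              } finally {
                  out.close();
                  client.close();
                  server.close();
              }
          }
      }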

    Read the article

  • How can I find file system concurrency issues?

    - by krosenvold
    I have an application running on Linux, and I find myself wanting Windows (!). The problem is that every 1000 runs or so I run into concurrency problems that are consistent with concurrent reading/writing of files. I am fairly sure that this behavior would be prohibited by file locking under Windows, but I don't have a sufficiently fast Windows box to check. There is simply too much file access (too much data) to expect strace to work reliably - the sheer volume of output is likely to change the problem too. It also happens on different files every time. Ideally I would like to change/reconfigure the Linux file system to be more restrictive (as in fail-fast) with regard to concurrent access. Are there any tools/settings I can use to achieve this?

    Read the article

  • File sizing issue in DOS/FAT

    - by Heather
    I've been tasked with writing a data collection program for a Unitech HT630, which runs a proprietary DOS operating system that can run executables compiled for 16-bit MS DOS, with some restrictions. I'm using the Digital Mars C/C++ compiler, which is working well thus far.

    One of the application requirements is that the data file must be human-readable plain text, meaning the file can be imported into Excel or opened by Notepad. I'm using a variable-length record format much like CSV that I've successfully implemented using the C standard library file I/O functions.

    When saving a record, I have to calculate whether the updated record is larger or smaller than the version of the record currently in the data file. If larger, I first shift all records immediately after the current record forward by the size difference calculated before saving the updated record. EOF is extended automatically by the OS to accommodate the extra data. If smaller, I shift all records backwards by my calculated offset. This is working well; however, I have found no way to modify the EOF marker or file size to ignore the data after the end of the last record.

    Most of the time records will grow in size, because the data collection program will be filling some of the empty fields with data when saving a record. Records will only shrink in size when a correction is made on an existing entry, or on a normal record save if the descriptive data in the record is longer than what the program reads in memory. In the situation of a shrinking record, after the last record in the file I'm left with whatever data was sitting there before the shift. I have been writing an EOF delimiter into the file after a "shrinking record save" to signal where the end of my records is, and space-filling the remaining data, but then I no longer have a clean file until a "growing record save" extends the size of the file over the space-filled area.

    The truncate() function in unistd.h does not work (I'm now thinking this is for *nix flavors only?). One proposed solution I've seen involves creating a second file, writing all the data you wish to save into that file, and then deleting the original. Since I only have 4MB worth of disk space to use, this works if the file size is less than 2MB minus the size of my program executable and configuration files, but would fail otherwise. It is very likely that when this goes into production, users would end up with a file exceeding 2MB in size.

    I've looked at Ralph Brown's Interrupt List and the interrupt reference in IBM PC Assembly Language and Programming and I can't seem to find anything to update the file size or similar. Is reducing a file's size without creating a second file even possible in DOS?

    Read the article

  • Write physical table to csv file

    - by urema
    Hi, I was wondering if anyone knows how to write an actual table/grid to a CSV file. I don't mean the content of the table/grid; I mean the actual grid lines, headers, axes, etc. Thanks greatly in advance. U.

    Read the article

  • Writing the output of IEnumerable<XElement> to an XML File

    - by Googler
    Hi folks, this is my code to read two XML files and merge their elements. I want to write the output to an XML file. This is my code:

      IEnumerable<XElement> list0 = doc.Descendants(node1).Concat(doc2.Descendants(node2));
      foreach (XElement el in list0)
          Console.WriteLine(el);

    Instead of writing to the console, I need to write it to an XML file; the output is also in XML format. How do I achieve this? Can anyone please give me code or a method to achieve this?

    Read the article

  • Very weird C file-handling anomaly

    - by KáGé
    Hello, I have a very weird issue that I can't figure out in my school project, which is the simulation of a simple filesystem in a human-readable text file. Unfortunately I don't yet have enough time to translate the comments in my code or make it less gibberish, so if you are bothered by that, you don't have to help, I understand. See the code HERE. Now in drive.h, at line 574, is this part:

      i = getline();
      #ifdef DEBUG
      printf("Free space in all found at %d.\n\n", i);
      if (drive.disk != NULL) {
          printf("Disk OK\n\n");
      }
      #endif
      //write in data
      state = seekline(i);

    Before this, it finds a place for the allocation database entry in the ALL sector (see the "image files" in the mounts folder; this issue was tested on mount_30.efs-dbf), then gets the line with i = getline() fine (getline is in lglobal.h, line 39), but after that any file manipulation (in this case seekline's fseek, or if I comment that out, the first fprintf after it) crashes the program straight away. I think the file somehow gets corrupted (though the Disk OK message appears) but I can't figure out how. I've tried commenting out i = getline(); but it didn't make any difference. I've also asked at local programming forums but they didn't really help either. The last few lines of the output before it crashes:

      Dir written. (drive.h line 562)
      Seekline entered: 268 (called at drive.h line 564)
      Getline entered. (called at drive.h line 574)
      Line got: 268.
      Free space in all found at 268. (drive.h line 576)
      Seekline entered: 268 (called at drive.h line 582; note that this exact call ran successfully less than 20 lines back. This one should set the pointer to the beginning of the line it is currently in)

    After this it crashes. Does anyone have any idea what causes this and how I could fix it? Thank you.

    Read the article

  • Determine which process (b)locks a file, programmatically (under Windows >= XP)

    - by fred-hh
    How can I programmatically determine, from a process P, which other process P' has a lock on a file that prevents P from recreating that file? I know there are tools to do that, but how do they achieve it? (Context: a batch program that runs overnight fails because of a locked file. Running an admin tool the next day may be too late to get useful information. So it would be nice if the batch program itself were able to determine the culprit.) EDIT: Added complexity: the file resides on a DFS and P' might not run on the same machine as P (but maybe it does). But a solution that works locally would be a good beginning.

    Read the article

  • PHP: What is an efficient way to parse a text file containing very long lines?

    - by Shaun
    I'm working on a parser in PHP which is designed to extract MySQL records out of a text file. A particular line might begin with a string corresponding to which table the records (rows) need to be inserted into, followed by the records themselves. The records are delimited by a backslash and the fields (columns) are separated by commas. For the sake of simplicity, let's assume that we have a table representing people in our database, with the fields being First Name, Last Name, and Occupation. Thus, one line of the file might be as follows:

      [People] = "\Han,Solo,Smuggler\Luke,Skywalker,Jedi..."

    where the ellipses (...) could be additional people.

    One straightforward approach might be to use fgets() to extract a line from the file, and use preg_match() to extract the table name, records, and fields from that line. However, let's suppose that we have an awful lot of Star Wars characters to track. So many, in fact, that this line ends up being 200,000+ characters/bytes long. In such a case, taking the above approach to extract the database information seems a bit inefficient. You have to first read hundreds of thousands of characters into memory, then read back over those same characters to find regex matches.

    Is there a way, similar to the Java Scanner class's next(String pattern) method when constructed from a file, that allows you to match patterns in-line while scanning through the file? The idea is that you don't have to scan through the same text twice (to read it from the file into a string, and then to match patterns) or store the text redundantly in memory (in both the file line string and the matched patterns). Would this even yield a significant increase in performance? It's hard to tell exactly what PHP or Java is doing behind the scenes.

    Read the article

  • Is there a Base64Stream for .NET? Where?

    - by Cheeso
    If I want to produce a Base64-encoded output, how would I do that in .NET? I know that since .NET 2.0, there is the ICryptoTransform interface, and the ToBase64Transform() and FromBase64Transform() implementations of that interface. But those classes are embedded into the System.Security namespace, and require the use of a TransformBlock, TransformFinalBlock, and so on. Is there an easier way to base64 encode a stream of data in .NET?

    Read the article

  • Delete temp file during finally vs delete output file during catch

    - by Russell
    This is in Java 6. I've seen more than once that people create a temp file, do something with it, then rename it to the output file. Everything is wrapped in a try-finally block, where the temp file is deleted in finally in case something goes wrong in between.

      try {
          // do something with tempFile
          // do something with tempFile
          // do something with tempFile
          tempFile.renameTo(outputFile);
      } finally {
          if (tempFile.exists()) tempFile.delete();
      }

    I was wondering what the benefits are of doing that instead of working on the output file directly and deleting it in case of exceptions.

      try {
          // do something with outputFile
          // do something with outputFile
          // do something with outputFile
      } catch (Exception e) {
          if (outputFile.exists()) outputFile.delete();
      }

    My guess is that deleting temp files in finally benefits me when the try block can throw many kinds of exceptions. Is my guess right? What else?
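
    A hedged sketch of the full temp-file pattern being asked about, assuming the usual motivation: until the rename succeeds, a crash leaves only the temp file behind, so a half-written output never appears under its final name. The helper name and the write step are placeholders; note that renameTo() can return false (for example, on some platforms when the target already exists), which the catch-based variant never has to deal with.

      import java.io.File;
      import java.io.FileWriter;
      import java.io.IOException;
      import java.io.Writer;

      public class TempFileWrite {
          static void writeOutput(File outputFile, String data) throws IOException {
              // Create the temp file next to the output so the rename stays on one file system.
              File tempFile = File.createTempFile("out", ".tmp", outputFile.getParentFile());
              try {
                  Writer w = new FileWriter(tempFile);
                  try {
                      w.write(data);                        // placeholder for the real work
                  } finally {
                      w.close();
                  }
                  if (!tempFile.renameTo(outputFile)) {     // renameTo reports failure via its return value
                      throw new IOException("Could not rename " + tempFile + " to " + outputFile);
                  }
              } finally {
                  if (tempFile.exists()) tempFile.delete(); // no-op if the rename succeeded
              }
          }
      }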

    Read the article

  • Need a way to determine if a file is done being written to.

    - by Khorkrak
    The situation I'm in is this: there's a process that's writing to a file, and sometimes the file is rather large, say 400-500 MB. I need to know when it's done writing. How can I determine this? If I look in the directory I'll see it there, but it might not be done being written. Plus, this needs to be done remotely - as in on the same internal LAN but not on the same computer - and typically the process that wants to know when the file writing is done is running on a Linux box, while the process that's writing the file and the file itself are on a Windows box. No, Samba isn't an option. XML-RPC communication to a service on that Windows box is an option, as is using SNMP to check, if that's viable. Ideally: works on either Linux or Windows, meaning the solution is OS independent, and works for any type of file. Good enough: works just on Windows but can be driven through some library or other that can be accessed with Python, and works only for PDF files. My current best idea is to periodically open the file in question from some process on the Windows box and look at the last bytes, checking for the PDF end tag and accounting for the EOL differences, because the file may have been created on Linux or Windows.
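
    One common heuristic, sketched in Java purely for illustration (the same polling logic ports to Python or anything else that can reach the file): treat the file as finished once its length stops changing between polls and, for PDFs specifically, the %%EOF trailer marker appears near the end of the file, regardless of which line ending follows it. The class name is a placeholder, and this remains a heuristic rather than a guarantee.

      import java.io.File;
      import java.io.IOException;
      import java.io.RandomAccessFile;

      public class WriteCompletionPoller {
          // Heuristic: "done" once the length has stopped changing since the last poll
          // and the PDF trailer marker %%EOF is present near the end of the file.
          static boolean looksFinished(File f, long previousLength) throws IOException {
              long length = f.length();
              if (length == 0 || length != previousLength) {
                  return false;                       // still growing, or not started yet
              }
              RandomAccessFile raf = new RandomAccessFile(f, "r");
              try {
                  int tailSize = (int) Math.min(1024, length);
                  raf.seek(length - tailSize);
                  byte[] tail = new byte[tailSize];
                  raf.readFully(tail);
                  // %%EOF may be followed by \n, \r\n or \r depending on the producing OS.
                  return new String(tail, "ISO-8859-1").contains("%%EOF");
              } finally {
                  raf.close();
              }
          }
      }

    A caller would remember the length seen on the previous poll and only trust a true result after it has held steady for a couple of intervals.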

    Read the article

  • What's the correct way to do a "catch all" error check on an fstream output operation?

    - by Truncheon
    What's the correct way to check for a general error when sending data to an fstream?

    UPDATE: My main concern regards some things I've been hearing about a delay between output and any data being physically written to the hard disk. My assumption was that the command save_file_obj << save_str would only send data to some kind of buffer and that the following check if (save_file_obj.bad()) would not be any use in determining if there was an OS or hardware problem. I just want to know the definitive "catch all" way to send a string to a file and check to make certain that it was written to the disk, before carrying out any following actions such as closing the program.

    I have the following code...

      int Saver::output()
      {
          save_file_handle.open(file_name.c_str());
          if (save_file_handle.is_open())
          {
              save_file_handle << save_str.c_str();
              if (save_file_handle.bad())
              {
                  x_message("Error - failed to save file");
                  return 0;
              }
              save_file_handle.close();
              if (save_file_handle.bad())
              {
                  x_message("Error - failed to save file");
                  return 0;
              }
              return 1;
          }
          else
          {
              x_message("Error - couldn't open save file");
              return 0;
          }
      }

    Read the article

  • How do I turn an array of bytes back into a file and open it automatically with C#?

    - by Ace Grace
    Hi, I am writing some code to add file attachments to an application I am building. I have Add and Remove working, but I don't know where to start to implement Open. I have an array of bytes (from a table field) and I don't know how to make it open automatically. For example, if I have an array of bytes which is a PDF, how do I get my app to automatically open Acrobat, or whatever application is currently assigned to that extension, using C#?

    Read the article

  • Android USB Host Communication

    - by Kip Russell
    I'm working on a project that utilizes the USB host capabilities in Android 3.2. I'm suffering from a deplorable lack of knowledge and talent regarding USB/serial communication in general. I'm also unable to find any good example code for what I need to do. I need to read from a USB communication device. For example, when I connect via PuTTY (on my PC) I enter:

      >GO

    and the device starts spewing out data for me - pitch/roll/temp/checksum, e.g.:

      $R1.217P-0.986T26.3*60
      $R1.217P-0.986T26.3*60
      $R1.217P-0.987T26.3*61
      $R1.217P-0.986T26.3*60
      $R1.217P-0.985T26.3*63

    I can send the initial 'GO' command from the Android device, at which time I receive an echo of 'GO'. Then nothing else on any subsequent reads. How can I: 1) send the 'GO' command, and 2) read in the stream of data that results? The USB device I'm working with has the following interfaces (endpoints):

      Device Class: Communication Device (0x2)
      Interface #0  Class: Communication Device (0x2)
        Endpoint #0  Direction: Inbound (0x80)  Type: Interrupt (0x3)  Poll Interval: 255  Max Packet Size: 32  Attributes: 000000011
      Interface #1  Class: Communication Device Class (CDC) (0xa)
        Endpoint #0  Address: 129  Number: 1  Direction: Inbound (0x80)  Type: Bulk (0x2)  Poll Interval: 0  Max Packet Size: 32  Attributes: 000000010
        Endpoint #1  Address: 2  Number: 2  Direction: Outbound (0x0)  Type: Bulk (0x2)  Poll Interval: 0  Max Packet Size: 32  Attributes: 000000010

    I'm able to deal with permissions, connect to the device, find the correct interface and assign the endpoints. I'm just having trouble figuring out which technique to use to send the initial command and read the ensuing data. I've tried different combinations of bulkTransfer and controlTransfer with no luck. Thanks. I'm using interface #1, as seen below:

      public AcmDevice(UsbDeviceConnection usbDeviceConnection, UsbInterface usbInterface) {
          Preconditions.checkState(usbDeviceConnection.claimInterface(usbInterface, true));
          this.usbDeviceConnection = usbDeviceConnection;
          UsbEndpoint epOut = null;
          UsbEndpoint epIn = null;
          // look for our bulk endpoints
          for (int i = 0; i < usbInterface.getEndpointCount(); i++) {
              UsbEndpoint ep = usbInterface.getEndpoint(i);
              Log.d(TAG, "EP " + i + ": " + ep.getType());
              if (ep.getType() == UsbConstants.USB_ENDPOINT_XFER_BULK) {
                  if (ep.getDirection() == UsbConstants.USB_DIR_OUT) {
                      epOut = ep;
                  } else if (ep.getDirection() == UsbConstants.USB_DIR_IN) {
                      epIn = ep;
                  }
              }
          }
          if (epOut == null || epIn == null) {
              throw new IllegalArgumentException("Not all endpoints found.");
          }
          AcmReader acmReader = new AcmReader(usbDeviceConnection, epIn);
          AcmWriter acmWriter = new AcmWriter(usbDeviceConnection, epOut);
          reader = new BufferedReader(acmReader);
          writer = new BufferedWriter(acmWriter);
      }
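
    Not a verified answer, but a hedged sketch of the usual CDC-ACM sequence on top of the endpoints found above: raise DTR/RTS with a class-specific control transfer (many CDC modems stay silent until the host does), write the command on the bulk OUT endpoint with bulkTransfer, then poll the bulk IN endpoint on a background thread. The control request values are the standard CDC ones and the "\r\n" terminator is an assumption about what the modem expects; neither is confirmed for this particular device. The method is a fragment meant to live in the same class as the AcmDevice code above.

      // Sketch only: assumes the connection, epIn and epOut set up in AcmDevice above.
      void startSession(final UsbDeviceConnection connection, final UsbEndpoint epIn, UsbEndpoint epOut) {
          final int timeoutMs = 1000;

          // CDC SET_CONTROL_LINE_STATE (bRequest 0x22): raise DTR and RTS; many CDC
          // modems send nothing on the bulk IN endpoint until the host does this.
          connection.controlTransfer(0x21, 0x22, 0x03, 0, null, 0, timeoutMs);

          // Send the start command on the bulk OUT endpoint.
          byte[] cmd = "GO\r\n".getBytes();
          connection.bulkTransfer(epOut, cmd, cmd.length, timeoutMs);

          // Poll the bulk IN endpoint off the UI thread and accumulate the $R...P...T...* lines.
          new Thread(new Runnable() {
              public void run() {
                  byte[] buffer = new byte[epIn.getMaxPacketSize()]; // 32 bytes on this device
                  while (!Thread.currentThread().isInterrupted()) {
                      int read = connection.bulkTransfer(epIn, buffer, buffer.length, timeoutMs);
                      if (read > 0) {
                          Log.d("USB", new String(buffer, 0, read));
                      }
                      // read < 0 just means this poll timed out; keep looping
                  }
              }
          }).start();
      }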

    Read the article
