Search Results

Search found 8692 results on 348 pages for 'per magnusson'.

Page 67/348 | < Previous Page | 63 64 65 66 67 68 69 70 71 72 73 74  | Next Page >

  • C# - periodic data reading and Thread.Sleep()

    - by CaldonCZE
    Hello, my C# application reads data from a special USB device. The data are read as so-called "messages", each of them having 24 bytes. The number of messages that must be read per second may differ (the maximum frequency is quite high, about 700 messages per second), but the application must read them all. The only way to read the messages is by calling the function "ReadMessage", which returns one message read from the device. The function is from an external DLL and I cannot modify it. My solution: I've got a separate thread that is running all the time during the program run and its only job is to read the messages in a cycle. The received messages are then processed in the main application thread. The function executed in the "reading thread" is the following: private void ReadingThreadFunction() { int cycleCount = 0; try { while (this.keepReceivingMessages) { cycleCount++; TRxMsg receivedMessage; ReadMessage(devHandle, out receivedMessage); //...do something with the message... } } catch { //... catch exception if reading failed... } } This solution works fine and all messages are correctly received. However, the application consumes too many resources; the CPU of my computer runs at more than 80%. Therefore I'd like to reduce it. Thanks to the "cycleCount" variable I know that the "cycling speed" of the thread is about 40 000 cycles per second. This is far more than necessary, since I need to receive at most 700 messages/sec (and the device has a buffer for about 100 messages, so the cycle speed can be even a little lower). I tried to reduce the cycle speed by suspending the thread for 1 ms with the Thread.Sleep(1); command. Of course, this didn't work and the cycle speed became about 70 cycles/second, which was not enough to read all messages. I know that this attempt was silly, that putting the thread to sleep and then waking it up takes much longer than 1 ms. However, I don't know what else to do: Is there some other way to slow the thread execution down (to reduce CPU consumption) other than Thread.Sleep? Or am I completely wrong and should I use something different for this task instead of a Thread, maybe Threading.Timer or ThreadPool? Thanks a lot in advance for all suggestions. This is my first question here and I'm a beginner at using threads, so please excuse me if it's not clear enough.
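
    A minimal sketch of one way to lower the CPU load, in the same C# style as the code above. It assumes the DLL gives some way to tell whether a message was actually available (shown here as a hypothetical bool return value on ReadMessage, which the real signature in the question does not have), so the thread can drain whatever has accumulated and then sleep:

        private void ReadingThreadFunction()
        {
            const int pollIntervalMs = 20;  // at 700 msg/s roughly 14 messages accumulate per
                                            // interval, well under the ~100-message device buffer
            try
            {
                while (this.keepReceivingMessages)
                {
                    // Drain everything that arrived since the last wake-up.
                    TRxMsg receivedMessage;
                    while (ReadMessage(devHandle, out receivedMessage))  // hypothetical bool return
                    {
                        ProcessMessage(receivedMessage);  // hand the message to the main thread
                    }
                    Thread.Sleep(pollIntervalMs);  // coarse sleep; effective resolution is ~15 ms on Windows
                }
            }
            catch
            {
                //... catch exception if reading failed...
            }
        }

    If the DLL offers no such flag, a blocking read on the device (if one exists) or a longer fixed sleep sized to the buffer would serve the same purpose.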

    Read the article

  • Fastest way for inserting very large number of records into a Table in SQL

    - by Irchi
    The problem is, we have a huge number of records (more than a million) to be inserted into a single table from a Java application. The records are created by the Java code; it's not a move from another table, so INSERT/SELECT won't help. Currently, my bottleneck is the INSERT statements. I'm using PreparedStatement to speed up the process, but I can't get more than 50 records per second on a normal server. The table is not complicated at all, and there are no indexes defined on it. The process takes too long, and the time it takes will cause problems. What can I do to get the maximum speed (INSERTs per second) possible? Database: MS SQL 2008. Application: Java-based, using Microsoft JDBC driver.
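
    A rough sketch of JDBC batching, which usually gives a large jump over row-at-a-time inserts before reaching for bulk-load tools. The table and column names, the Record type, the records iterable and the connection variable are placeholders, not anything from the question:

        connection.setAutoCommit(false);
        String sql = "INSERT INTO MyTable (col1, col2, col3) VALUES (?, ?, ?)";
        try (PreparedStatement ps = connection.prepareStatement(sql)) {
            int count = 0;
            for (Record r : records) {
                ps.setInt(1, r.getId());
                ps.setString(2, r.getName());
                ps.setString(3, r.getValue());
                ps.addBatch();
                if (++count % 1000 == 0) {
                    ps.executeBatch();   // send 1000 rows per round trip instead of one
                }
            }
            ps.executeBatch();           // flush the remainder
            connection.commit();         // one commit for the whole load
        }

    Beyond batching, the usual options are loading into a staging table or using SQL Server's native bulk-load path (BULK INSERT / bcp) if the data can be written to a file first.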

    Read the article

  • Is it possible to get collection of some ejb`s instances from container?

    - by kislo_metal
    Hi! Scenario: I have some @Stateful bean for user sessions (not an HTTP session; it is a web-service session). And I need to manage sessions per user. Goal: I need the possibility to get a collection of @Stateful UserSession instances and control the maximum number of sessions per user, and the sessions' lifetime. Q: Is it possible to get a Collection of EJB instances from the EJB container, instead of storing them in some collection, map, etc.? I am using GlassFish v3, EJB 3.1, JAX-WS. Thank You!
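
    EJB 3.1 offers no standard container API for enumerating live @Stateful instances, so the usual workaround is exactly the thing the question hopes to avoid: a registry the beans report to. A minimal sketch (class and method names are invented, not from the question):

        // imports: javax.ejb.Singleton, java.util.*, java.util.concurrent.*
        @Singleton
        public class UserSessionRegistry {
            private final ConcurrentMap<String, List<UserSession>> sessionsPerUser =
                    new ConcurrentHashMap<String, List<UserSession>>();

            public void register(String userId, UserSession session) {
                List<UserSession> list = sessionsPerUser.get(userId);
                if (list == null) {
                    List<UserSession> fresh = new CopyOnWriteArrayList<UserSession>();
                    list = sessionsPerUser.putIfAbsent(userId, fresh);
                    if (list == null) list = fresh;   // this thread won the race
                }
                list.add(session);                    // enforce the per-user cap here if desired
            }

            public List<UserSession> sessionsOf(String userId) {
                List<UserSession> list = sessionsPerUser.get(userId);
                return list == null ? Collections.<UserSession>emptyList() : list;
            }
        }

    The @Stateful bean would call register() from its @PostConstruct method and remove itself in @Remove/@PreDestroy, which also gives a natural place to enforce the session-count and lifetime rules.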

    Read the article

  • Inner join 2 tables one to many 2 where clauses

    - by user2892350
    I'm a relative rookie at this, so please bear with me... I have 2 tables: OrderDetail and OrderMaster... both have a column named SalesOrder. The OrderDetail table has multiple rows per unique SalesOrder. The OrderMaster table has one row per unique SalesOrder. OrderDetail has a column named LineType. OrderMaster has a column named OrderStatus. I want to select all records from OrderDetail that have a LineType of "1" AND whose matching SalesOrder line in the OrderMaster table has an OrderStatus column value of "4". In plain English, orders with a Status 4 are open and ready to ship, and a LineType value of 1 means the detail line is a product code. How should this query be structured? It's going into VS 2008 (VB). Many thanks in advance!!! Mike
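
    A sketch of one way to write that query; the values are treated as strings here, so drop the quotes if LineType and OrderStatus are numeric columns in the actual schema:

        SELECT d.*
        FROM OrderDetail AS d
        INNER JOIN OrderMaster AS m
                ON m.SalesOrder = d.SalesOrder
        WHERE d.LineType = '1'
          AND m.OrderStatus = '4';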

    Read the article

  • Compressing digitalized document images

    - by Adabada
    Hello, We are now required by law to digitalize all the financial documents in our company and submit them to evaluations every 3 months. Since this is sensitive data we decided to take matters into our own hands and build some sort of digital data archiver. The tool works perfectly, but after 7 months of usage we are beginning to worry about the disk space used by these images. Here is some info on the amount of documents digitalized: 15K documents scanned and archived per day, with a final PNG size of +- 860KB: 15 000 * 860 kilobits = 1.53779984 gigabytes. 30 days of work per month: 1.53779984 gigabytes * 30 = 46.1339952 gigabytes. Expectation of disk space usage after 1 year: 46.1339952 gigabytes * 12 = 553.607942 gigabytes. So far we're at 424 gigabytes of disk space used, without counting backups. We're using PNG as the image format, but I would like to know if anyone has any advice on a better compression algorithm for images, or alternative strategies for compressing the PNGs even more, or even better ways to archive images so as to save disk space. Any help would be appreciated, thanks.

    Read the article

  • script to delete all lines starting from a word, except the last line

    - by akvikram
    How to delete all lines below a word, except the last line in a file? Suppose I have a file which contains:

        | 02/04/2010 07:24:20 | 20-24 | 26         | 13       | 2.60        |
        | 02/04/2010 07:24:25 | 25-29 | 6          | 3        | 0.60        |
        +---------------------+-------+------------+----------+-------------+
        02-04-2010-07:24 --- ER GW 03
        +---------------------+-------+------------+----------+-------------+
        | date                | sec   | BOTH_MO_MT | MO_or_MT | TPS_PER_SEC |
        +---------------------+-------+------------+----------+-------------+
        | 02/04/2010 07:00:00 | 00-04 | 28         | 14       | 2.80        |
        | 02/04/2010 07:00:05 | 05-09 | 27         | 14       | 2.70        |
        ... ... ... ...
        END OF TPS PER 5 REPORT

    and I need to delete all contents from "02-04-2010-07:24 --- ER GW 03" onwards, except "END OF TPS PER 5 REPORT", and save the file. This has to be done for around 700 files. All files have the same format, with a date-month-day filename.
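
    A sketch of one way to do it with GNU sed's in-place editing. It assumes the "--- ER GW 03" header appears only once per file (marking the block to strip), that the closing "END OF TPS PER 5 REPORT" line is the one to keep, and that the 700 files can be matched with a shell glob (the *.txt pattern below is a placeholder for the real date-based filenames):

        #!/bin/sh
        # From the "ER GW 03" header line to the end of the file, delete every line
        # except the final "END OF TPS PER 5 REPORT" line, editing each file in place.
        for f in *.txt; do
            sed -i '/--- ER GW 03/,$ { /END OF TPS PER 5 REPORT/!d; }' "$f"
        done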

    Read the article

  • What technologies to use for a particle system with enormous calculation demand?

    - by Amir Rezaei
    I have a particle system with X particles. Each particle tests for collision with other particles. This gives X*X = X^2 collision tests per frame. At 60 frames per second, this corresponds to 60*X^2 collision tests per second. What is the best technological approach for these intensive calculations? Should I use F#, C, C++ or C#, or something else? The following are the constraints: The code is written in C# with the latest XNA; Multi-threading may be considered; No special algorithm that tests the collision with the nearest neighbors or that otherwise reduces the problem. The last constraint may be strange, so let me explain. Regardless of constraint 3, given a problem with an enormous computational requirement, what would be the best approach to solving it? An algorithm reduces the problem; still, the same algorithm may behave differently depending on the technology. Consider the pros and cons of the CLR vs native C.

    Read the article

  • Comparing 2 LINQ applications: Unexpected result

    - by lukesky
    I drafted 2 ASP.NET applications using LINQ. One connects to MS SQL Server, the other to a proprietary memory structure. Both applications work with tables of 3 int fields, having 500 000 records (the memory structure is identical to the SQL Server table). The controls used are regular: GridView and ObjectDataSource. In the applications I calculate the average time needed to process each paging click. The LINQ + MS SQL application demands 0.1 sec per page change. The LINQ + memory structure demands 0.8 sec per page change. This is a shocking result. Why does the application handling data in memory work 8 times slower than the application using the hard drive? Can anybody tell me why that happens?

    Read the article

  • Need advice on cron job'ing a very large process

    - by Arms
    I have a PHP script that grabs data from an external service and saves data to my database. I need this script to run once every minute for every user in the system (of which I expect there to be thousands). My question is, what's the most efficient way to run this per user, per minute? At first I thought I would have a function that grabs all the user IDs from my database, iterates over the IDs and performs the task for each one, but I think that as the number of users grows, this will take longer and no longer fall within 1-minute intervals. Perhaps I should queue the user IDs and perform the task individually for each one? In which case, I'm actually unsure of how to proceed. Thanks in advance for any advice.
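
    A sketch of one way to keep the work inside the minute: split the user list across several parallel workers, each launched by its own cron entry. Everything here is made up for illustration (the users table, the PDO credentials, the fetchAndSaveExternalData() helper, and the paths in the crontab lines); a proper job queue (e.g. Gearman or beanstalkd) is the next step up from this:

        <?php
        // worker.php - crontab runs N copies per minute, e.g. for 4 workers:
        //   * * * * * php /path/to/worker.php 0 4
        //   * * * * * php /path/to/worker.php 1 4
        //   * * * * * php /path/to/worker.php 2 4
        //   * * * * * php /path/to/worker.php 3 4
        $slice  = (int)$argv[1];   // which slice this worker owns
        $slices = (int)$argv[2];   // total number of workers

        $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
        $ids = $pdo->query('SELECT id FROM users')->fetchAll(PDO::FETCH_COLUMN);

        foreach ($ids as $id) {
            if ($id % $slices !== $slice) {
                continue;                      // not this worker's slice
            }
            fetchAndSaveExternalData($id);     // hypothetical per-user task
        }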

    Read the article

  • Is there a way to customise messages produced by statements in MS SQL Query Analyzer?

    - by Scott Leis
    If I run a simple query in SQL Query Analyzer, like: SELECT * FROM TableName the Messages pane always produces a message like: (30 row(s) affected) If I run a stored procedure with many statements, the messages are useless because there's no indication of what each one relates to. So firstly: Is there a way to customise the default messages on a per-query basis? E.g. I'd like a specific query to produce a message like: TableName query produced [numRowsAffected] results. replacing [numRowsAffected] with the number that would have appeared in the default message. Secondly, is there a way to suppress the default messages on a per-query basis? E.g. I have a local variable of type TABLE, used in several statements. I don't want any message to appear for statements where I'm just deleting data from that variable before re-using it. I'm seeking solutions that work in SQL Server 8.0.
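
    A sketch of the usual combination in Transact-SQL: SET NOCOUNT ON suppresses the default "(n row(s) affected)" messages, and PRINT with @@ROWCOUNT (read immediately after the statement of interest) produces a custom one:

        SET NOCOUNT ON;   -- stop the default messages for the statements that follow

        SELECT * FROM TableName;
        PRINT 'TableName query produced ' + CAST(@@ROWCOUNT AS varchar(10)) + ' results.';

        -- statements whose messages should stay suppressed (e.g. work on the table variable)
        -- simply run with NOCOUNT still ON; turn it back off where the defaults are wanted
        SET NOCOUNT OFF;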

    Read the article

  • Retrieve Access Control List of Documents for Specific User on Google Docs?

    - by viatropos
    I have a large website at www.mydomain.com. There are 1000 new documents per month and 100 new users per week, let's say. I need to be able to programmatically do the following: the user goes to www.mydomain.com/documents; the user sees a list of all documents they have access to (not ALL of the docs). I know you can retrieve an ACL for each document individually. But is there a way to retrieve an ACL for each "user", a list of all the documents they have access to, in one HTTP request? Something like this, but for Docs (and not just for document "owners"): Retrieving only calendars that a user owns. I'd love to know, because it seems like I'd have to parse 10,000 document "entry" tags, find the ACL, and see if the user is in the ACL... That seems crazy. What am I missing? Thanks!

    Read the article

  • Which database should I use for best performance

    - by _simon_
    Hello, I am working in Visual Studio 2005, .NET 2.0. I need to write an application which listens on a COM port and saves incoming data to a database. Main feature: save incoming data (a series of 13-digit numbers); if a number already exists, mark it as a double. For example, there could be these records in the database: 0000000000001 OK 0000000000002 OK 0000000000002 Double 0000000000003 OK 0000000000004 OK I could use an SQL database, but I don't know if it is fast enough... The database should be able to store up to 10,000,000 records and write up to 100 records per minute (so it needs to check 100 times per minute whether a record already exists). Which database should I use? Maybe the whole database would need to be in RAM. Where could I learn more about this? Thanks
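
    At 100 checks per minute against 10 million rows, any mainstream engine with an index on the number column is comfortably fast enough; the whole table does not need to live in RAM (the index pages that matter end up cached anyway). A sketch in SQL Server syntax, with made-up table and column names and @code standing for the incoming number:

        CREATE TABLE Scans (
            Id     int IDENTITY(1,1) PRIMARY KEY,
            Code   char(13)   NOT NULL,
            Status varchar(6) NOT NULL      -- 'OK' or 'Double'
        );
        CREATE INDEX IX_Scans_Code ON Scans (Code);

        -- per incoming number: look it up and insert with the appropriate status
        INSERT INTO Scans (Code, Status)
        SELECT @code,
               CASE WHEN EXISTS (SELECT 1 FROM Scans WHERE Code = @code)
                    THEN 'Double'
                    ELSE 'OK'
               END;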

    Read the article

  • Unique visitor counting in ASP.NET MVC

    - by Max
    I'd like to do visitor tracking similar to how Stack Overflow does it. By reading through numerous posts, I've figured out some details already: Count only 1 IP hit per 15 minutes (if anonymous); Count only 1 unique user login (per day?). Now that leaves the question of the real implementation. Should I log the two factors live into a table (and increase a count): | IP | timestamp | pageurl | Or do the counting AFTERWARDS (e.g. using IIS log files - which don't include the user, right)? I know there are some similar posts out there, but NONE really has a great solution in my opinion yet.

    Read the article

  • iPhone "multi-threading" question

    - by MrDatabase
    I have a simple iPhone game consisting of two "threads": the main game loop, where all updating and rendering happen 30 times per second (NSTimer)... and the "thread" that calls the accelerometer delegate 100 times per second. I have a variable "xPosition" that's updated in the accelerometer delegate function and used in the game loop. Is there a possibility of the two "threads" trying to use xPosition at the same time (hence causing a crash or some other problem)? If so, how can I fix this with minimal impact on the game's performance? I've been using this set-up for many months of development and incremental testing and I've never run into any problems. Cheers!

    Read the article

  • Reserved Memory Addresses?

    - by Nate
    Is there a list of reserved memory addresses out there - a list of addresses that the memory of a user-space program could never be allocated to? I realize this is most likely per-OS or per-architecture, but I was hoping someone might know some of the more common OSes and arches. I could only dig one up for a few versions of Windows: for Windows NT, 2k and XP that would be:

        0x00000000 - 0x0000ffff - lowest pages are protected to simplify debugging
        0x00010000 - 0x7ffeffff - memory area for your application
        0x7fff0000 - 0x7fffffff - protected area to keep memory functions from damaging the following part
        0x80000000 - 0xffffffff - memory where the system, including drivers and so on, is located

    Anyone know about Linux, or BSD (or anything else, for that matter)?

    Read the article

  • MySQL Full-text Search Workaround for innoDB tables

    - by Rob
    I'm designing an internal web application that uses MySQL as its backend database. The integrity of the data is crucial, so I am using the innoDB engine for its foreign key constraint features. I want to do a full-text search of one type of records, and that is not supported natively with innoDB tables. I'm not willing to move to MyISAM tables due to their lack of foreign key support and due to the fact that their locking is per table, not per row. Would it be bad practice to create a mirrored table of the records I need to search using the MyISAM engine and use that for the full-text search? This way I'm just searching a copy of the data and if anything happens to that data it's not as big of a deal because it can always be re-created. Or is this an awkward way of doing this that should be avoided? Thanks.
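
    A sketch of the mirrored-table idea, with invented table and column names: a MyISAM copy carries only the searchable columns plus a FULLTEXT index, triggers on the InnoDB table keep it in step, and searches join back to the InnoDB table for the authoritative row:

        CREATE TABLE article_search (
            article_id INT NOT NULL PRIMARY KEY,
            title      VARCHAR(255),
            body       TEXT,
            FULLTEXT KEY ft_title_body (title, body)
        ) ENGINE=MyISAM;

        -- keep the mirror current (INSERT shown; UPDATE and DELETE triggers would be similar)
        CREATE TRIGGER article_after_insert AFTER INSERT ON article
        FOR EACH ROW
            INSERT INTO article_search (article_id, title, body)
            VALUES (NEW.id, NEW.title, NEW.body);

        -- search the mirror, then pull the real rows from the InnoDB table
        SELECT a.*
        FROM article_search s
        JOIN article a ON a.id = s.article_id
        WHERE MATCH(s.title, s.body) AGAINST ('search terms');

    Since the mirror is derived data, it can always be rebuilt from the InnoDB table if it is ever lost or corrupted, which keeps the integrity risk of MyISAM contained.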

    Read the article

  • Fast, cross-platform timer?

    - by dsimcha
    I'm looking to improve the D garbage collector by adding some heuristics to avoid garbage collection runs that are unlikely to result in significant freeing. One heuristic I'd like to add is that GC should not be run more than once per X amount of time (maybe once per second or so). To do this I need a timer with the following properties: It must be able to grab the correct time with minimal overhead. Calling core.stdc.time takes an amount of time roughly equivalent to a small memory allocation, so it's not a good option. Ideally, should be cross-platform (both OS and CPU), for maintenance simplicity. Super high resolution isn't terribly important. If the times are accurate to maybe 1/4 of a second, that's good enough. Must work in a multithreaded/multi-CPU context. The x86 rdtsc instruction won't work.
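
    A sketch in C of the per-platform calls such a timer usually wraps (the D runtime would bind to the same OS functions): QueryPerformanceCounter on Windows and clock_gettime(CLOCK_MONOTONIC) on POSIX systems. Both are cheap enough to call once per collection attempt, and both are monotonic, which wall-clock time() is not:

        #ifdef _WIN32
        #include <windows.h>
        static double now_seconds(void) {
            LARGE_INTEGER freq, counter;
            QueryPerformanceFrequency(&freq);   /* ticks per second */
            QueryPerformanceCounter(&counter);
            return (double)counter.QuadPart / (double)freq.QuadPart;
        }
        #else
        #include <time.h>
        static double now_seconds(void) {
            struct timespec ts;
            clock_gettime(CLOCK_MONOTONIC, &ts);
            return (double)ts.tv_sec + ts.tv_nsec / 1e9;
        }
        #endif

        /* Heuristic check: skip the collection if the last one was under a second ago.
           A real implementation would guard last_gc_time with the GC's own lock. */
        static double last_gc_time = 0.0;
        static int should_collect(void) {
            double now = now_seconds();
            if (now - last_gc_time < 1.0)
                return 0;
            last_gc_time = now;
            return 1;
        }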

    Read the article

  • Create a PHP cache system in MySQL database?

    - by Zach Smith
    I'm creating a web service that often scrapes data from remote web pages. After scraping this data, I have a simple multidimensional array of information to use. The scraping process is fairly taxing on my server, and the page load takes a while. I was considering adding a simple cache system using a MySQL database, where I create one row per remote web page with the array of information pulled from it stored as a JSON-encoded string. Is this a good enough system? Or would something like a text file per web page be a better idea?
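
    Either storage works; the database version is easier to expire and query. A sketch of the lookup-then-fallback flow, assuming a connected $pdo handle (the page_cache table, its columns, and scrape_remote_page() are all made up for illustration):

        <?php
        $hash = md5($url);

        $stmt = $pdo->prepare(
            'SELECT payload FROM page_cache
              WHERE url_hash = ? AND fetched_at > NOW() - INTERVAL 1 DAY');
        $stmt->execute(array($hash));
        $json = $stmt->fetchColumn();

        if ($json !== false) {
            $data = json_decode($json, true);            // fresh cache hit
        } else {
            $data = scrape_remote_page($url);            // hypothetical scraper
            $ins = $pdo->prepare(
                'REPLACE INTO page_cache (url_hash, url, payload, fetched_at)
                 VALUES (?, ?, ?, NOW())');
            $ins->execute(array($hash, $url, json_encode($data)));
        }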

    Read the article

  • MySQL - accessing a table sum and compare to another table?

    - by assignment_operator
    This is for a homework assignment. I just plain don't understand how to do it. The instructions for this particular question are: List the branch name for all branches that have at least one book that has at least 4 copies on hand. The tables in question are:

        Branch:
        BranchName        | BranchId
        Henry Downtown    | 1
        16 Riverview      | 2
        Henry On The Hill | 3

        Inventory:
        BookId | BranchId | OnHand
        1      | 1        | 2
        2      | 3        | 4
        3      | 1        | 8
        4      | 3        | 1
        5      | 1        | 2
        6      | 2        | 3

    From what I understand, I can get the OnHand total per branch name with: SELECT BranchName, SUM(OnHand) FROM Branch B, Inventory I WHERE B.BranchId = I.BranchId GROUP BY BranchName; but I don't get how I'd do the comparison between the sum of OnHand per branch and 4. Any help would be appreciated, guys!
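
    A sketch of both readings, since it is worth checking which one the assignment intends. The literal wording "at least one book that has at least 4 copies" is a per-row test on OnHand; comparing the per-branch total to 4 instead uses HAVING on the grouped sum:

        -- per-row reading: a branch qualifies if any single inventory row has OnHand >= 4
        SELECT DISTINCT B.BranchName
        FROM Branch B
        JOIN Inventory I ON I.BranchId = B.BranchId
        WHERE I.OnHand >= 4;

        -- per-branch-total reading: compare the summed OnHand to 4 with HAVING
        SELECT B.BranchName, SUM(I.OnHand) AS TotalOnHand
        FROM Branch B
        JOIN Inventory I ON I.BranchId = B.BranchId
        GROUP BY B.BranchName
        HAVING SUM(I.OnHand) >= 4;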

    Read the article

  • attaching multiple files to a domain class

    - by Emyr
    I've seen various Grails plugins which allow easier handling of file uploads, however these tend to support only a single file per form submit. I'd like a multi-attach form where, as soon as you pick one file, an extra field and button are added using JS (various sites do it like this). Do you know of any good plugins which provide elegant uploading of multiple files without excessive coding? A progress bar, either per file or for the whole process, would also be very nice. I don't know to what extent I can allow GORM to handle a java.io.File field (or in this case a Collection<File>).
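
    Plugins aside, the controller side of a multi-file form is fairly small, since Grails exposes Spring's multipart request. A sketch (the "attachments" field name and the upload directory are invented; the JS that clones extra file inputs is omitted):

        import org.springframework.web.multipart.MultipartHttpServletRequest

        class AttachmentController {

            def upload = {
                def uploadDir = new File('/var/uploads')   // placeholder destination directory
                if (request instanceof MultipartHttpServletRequest) {
                    // all <input type="file" name="attachments"> fields arrive as one list
                    request.getFiles('attachments').each { file ->
                        if (!file.empty) {
                            file.transferTo(new File(uploadDir, file.originalFilename))
                        }
                    }
                }
                redirect(action: 'list')
            }
        }

    Storing the saved path (or a byte[]) on the domain class is usually easier than asking GORM to persist a java.io.File directly.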

    Read the article

  • Caching for database questions.

    - by SeanD
    When we say caching, like using memcache or Redis, is this 1:1 caching between the user and the cache, or can we cache 1 item and use it for all users? Some items, like a friend list, will be 1:1, that is, unique per user. But if I want to cache the auto-complete list for city lookups, which can be used by any user, will it just store 1 list in the cache used by all users at the same time, or does it need to store 1 list per user? Is it possible to cache the entire database, all the lookups, all the users, all their photos, etc., using memcache or Redis? So from the above example: a friend list will be cleared from the cache when the user logs off. But something like city auto-complete will stay in the cache 24-7-365, am I correct?
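
    The cache itself has no notion of "per user"; it is just keys and values, so the split comes from how the keys are named. A sketch in PHP with the Memcache extension, assuming a connected $memcache instance (the key names and load_cities_from_db() are made up):

        <?php
        // Per-user data: the user id is baked into the key, so each user gets their own entry.
        $friends = $memcache->get("friends:$userId");

        // Shared data: one global key that every request, for every user, reuses.
        $cities = $memcache->get('autocomplete:cities');
        if ($cities === false) {
            $cities = load_cities_from_db();                           // hypothetical loader
            $memcache->set('autocomplete:cities', $cities, 0, 86400);  // keep for a day, shared by all
        }

    Caching "the entire database" works the same way in principle, but memory is the limit; a shared list can happily sit in the cache around the clock as long as it is refreshed or invalidated when the underlying data changes.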

    Read the article

  • Sum variable range of cells using "today's" date as starting point.

    - by Jason
    How do you sum a variable range of cells based upon today's date in MS Excel 2003? Spreadsheet format: Variable range = # of days to sum. Date range = listed in row 1, 1 day per cell (example: A1=1/1/10, B1=1/2/10, C1=1/3/10...). Numbers to be summed = listed in row 2, X number per cell (example: A2=8, B2=6, C2=1...). Example problem: if variable range = 2 and current date = 1/2/10, then Sum(B2:C2) = 7. I am able to sum the entire row based upon the current date using the following formula, but am not able to add the variable range to the sum function: =SUMIF(A1:C1,"="&TODAY(),A2:C2)
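
    A sketch of one way to bolt the day count onto that idea with OFFSET and MATCH, assuming the dates sit in A1:Z1 as real date values, the numbers in A2:Z2, and the number of days to sum is typed into a helper cell such as E5 (all of those ranges and the helper cell are placeholders to adjust):

        =SUM(OFFSET($A$2, 0, MATCH(TODAY(), $A$1:$Z$1, 0) - 1, 1, $E$5))

    MATCH finds today's column in row 1, OFFSET turns that into a range starting in row 2 that is $E$5 cells wide, and SUM adds it up. With the question's example (today = 1/2/10 in B1 and a range of 2) it resolves to SUM(B2:C2) = 7.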

    Read the article

  • Is it possible to make an online registration process for a course in Google Docs with a cap on registrations?

    - by user61551
    I am trying to build an online registration form for an institute I intern at that is running a training programme with multiple courses and hundreds of applicants. I've done registration forms with Google Docs Forms before, but the tricky spot this time round is that I'm trying to put a registration cap on each of the courses. Every person needs to register for one course in every timeslot, and there are seven or so timeslots. However, as seating space in every venue is limited, we would like to cap registrations at 200 per course (venue), after which either a) the applicant will be informed that they need to register for another course in that timeslot if they apply to a full course, or b) in the ideal case, the option will be greyed out. Is this possible using Google Docs Forms at all? If not, is there a free alternative that might do the job properly?

    Read the article
