Search Results

Search found 80052 results on 3203 pages for 'data load performance'.

Page 403/3203 | < Previous Page | 399 400 401 402 403 404 405 406 407 408 409 410  | Next Page >

  • Shared data in an ASP.NET application

    - by Barguast
    I have a basic ASP.NET application that serves data stored on disk: the data is loaded from files and sent as the response. I want to store the data loaded from these files in memory, to reduce the number of disk reads. All of the requests will be asking for the same data, so it makes sense to have a single in-memory cache accessible to all requests. What is the best way to create a single, accessible object instance which I can use to store and access this cached data? I've looked into HttpApplication, but apparently a new instance of it is created for parallel requests, so it doesn't fit my needs.
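
    One common approach, sketched below under the assumption that the files can be loaded once at startup: a static holder class gives every request the same instance, since statics are per-AppDomain rather than per-request. The file path is a hypothetical placeholder.

        using System.IO;

        // Static field initializers run once per AppDomain; the CLR guarantees
        // thread-safe, one-time initialization, so every request shares this
        // single copy of the data.
        public static class DataCache
        {
            public static readonly byte[] Data =
                File.ReadAllBytes(@"C:\data\payload.bin"); // hypothetical path
        }

    Requests can then read DataCache.Data directly; the first touch of the type triggers the one-time load.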

    Read the article

  • Android SQL: creating the database once

    - by semajhan
    One thing I'm not understanding is how to create the database and its data just once in an Android application. I extend SQLiteOpenHelper and use a DataHelper class to manipulate the data. Now, I have addEvent() and updateEvent() within the DataHelper class. I create an instance of DataHelper in my Activity and call addEvent() a couple of times to insert data. Well, now I don't know how to do that just once: if I restart the app, it's just going to call addEvent() again, so the data is reset every time. Sorry for the probably really REALLY noob question. The only solution I found was not using the DataHelper class and just adding the data "manually" within the onCreate() method of SQLiteOpenHelper.
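
    The onCreate() route the asker found is in fact the usual pattern; a sketch of it, with hypothetical database, table and column names: SQLiteOpenHelper only calls onCreate() when the database file does not yet exist, so seed data inserted there is created exactly once per install.

        import android.content.Context;
        import android.database.sqlite.SQLiteDatabase;
        import android.database.sqlite.SQLiteOpenHelper;

        public class DataHelper extends SQLiteOpenHelper {
            public DataHelper(Context context) {
                super(context, "events.db", null, 1); // name and version are examples
            }

            @Override
            public void onCreate(SQLiteDatabase db) {
                // Runs only the first time the database file is created.
                db.execSQL("CREATE TABLE events (_id INTEGER PRIMARY KEY, name TEXT)");
                db.execSQL("INSERT INTO events (name) VALUES ('initial event')");
            }

            @Override
            public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
                // Bump the version and migrate here when the schema changes.
            }
        }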

    Read the article

  • WordPress meta data is written at the top of the page instead of in the loop

    - by Fruxelot
    I'm building a WordPress page based on the Skeleton WordPress theme. I have 2 posts showing on a page, and each of these posts has custom field values (meta data). I'm using the shortcode from the Skeleton theme to get a post feed from a specific category, and in that loop I have inserted this tag, which displays the custom fields data: <?php the_meta(); ?> I am getting the data, but the problem is that it is shown at the TOP of the page instead of inside the post. What could I possibly have done wrong? Or is it something with Skeleton I am doing wrong? Webpage: http://visbyfangelse.se.preview.binero.se/rum-priser-preview/ As you can see, two posts are shown, and the meta data appears at the top of the page. Code of the loop: http://pastebin.com/mRQY5GNz As you can see, I want the meta displayed in the div to which I assigned the class "my_room_meta".
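
    A likely explanation, offered as a sketch rather than a confirmed diagnosis: shortcode callbacks must return their markup, but the_meta() echoes immediately, so its output is flushed before the shortcode's returned HTML. Building the string with get_post_meta() instead keeps the data inside the post; the meta key here is a hypothetical example.

        <?php
        // Inside the shortcode's loop: append the meta to the returned markup
        // instead of echoing it in place with the_meta().
        $price   = get_post_meta( get_the_ID(), 'room_price', true ); // example key
        $output .= '<div class="my_room_meta">' . esc_html( $price ) . '</div>';
        ?>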

    Read the article

  • Micro Controller Serial Data identification or classification

    - by Posiedon
    I have an x51-family microcontroller (P89V51RD2). I'm going to send some data from the computer over the serial port. The data I'll be sending is the character 'S', the character 'R', and a 2-digit integer, and upon receiving the data I will be calling separate functions. I used if (chr == 'S') and else if (chr == 'R') for the character data. The main problem lies in identifying the 2-digit number sent. Any data other than the three kinds mentioned above will be discarded. Any ideas for identifying the two-digit integer?
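
    A sketch of one way to handle it on the receive side, assuming the two digits arrive as ASCII characters: accumulate them into an integer and hand the value off once both digits are in. The handler functions are hypothetical placeholders.

        /* Hypothetical handlers for the three kinds of input. */
        void do_s(void);
        void do_r(void);
        void handle_number(unsigned char value);

        static unsigned char digits = 0;   /* digits collected so far */
        static unsigned char value  = 0;   /* number being assembled  */

        void handle_byte(unsigned char chr)
        {
            if (chr == 'S')      { digits = 0; value = 0; do_s(); }
            else if (chr == 'R') { digits = 0; value = 0; do_r(); }
            else if (chr >= '0' && chr <= '9') {
                value = value * 10 + (chr - '0');   /* shift in the digit */
                if (++digits == 2) {                /* both digits received */
                    handle_number(value);
                    digits = 0;
                    value  = 0;
                }
            }
            /* anything else is discarded */
        }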

    Read the article

  • Sharing information between applications

    - by Zé Carlos
    My question is simple to state: I have a few applications that share data among themselves. I need a way to support that data sharing (across several computers), and when one application changes the data, the others should be notified about it. My question is about which technologies would be useful to me. The solution I have in mind at the moment is a database to share the data, plus an external publish-subscribe system (like http://pubsub.codeplex.com/) to notify all applications when the data is changed. But I believe more helpful solutions may exist. Do you know any of them? Thanks.

    Read the article

  • MySQL Insert Query Randomly Takes a Long Time

    - by ShimmerTroll
    I am using MySQL to manage session data for my PHP application. When testing the app, it is usually very quick and responsive. However, seemingly randomly the response will stall before finally completing after a few seconds. I have narrowed the problem down to the session write query which looks something like this: INSERT INTO Session VALUES('lvg0p9peb1vd55tue9nvh460a7', '1275704013', '') ON DUPLICATE KEY UPDATE sessAccess='1275704013',sessData=''; The slow query log has this information: Query_time: 0.524446 Lock_time: 0.000046 Rows_sent: 0 Rows_examined: 0 This happens about 1 out of every 10 times. The query usually only takes ~0.0044 sec. The table is InnoDB with about 60 rows. sessId is the primary key with a BTREE index. Since this is accessed on every page view, it is clearly not an acceptable execution time. Why is this happening? Update: Table schema is: sessId:varchar(32), sessAccess:int(10), sessData:text
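
    A diagnostic sketch, not a confirmed diagnosis: on InnoDB, occasional stalls on tiny writes are often traced to the redo-log flush policy, where a commit occasionally waits on a disk flush. Checking and, if durability allows, relaxing that setting is a common first experiment (SET GLOBAL requires the SUPER privilege):

        SHOW VARIABLES LIKE 'innodb_flush_log_at_trx_commit';

        -- Value 1 (the default) flushes the log to disk at every commit;
        -- value 2 flushes roughly once per second, trading up to a second
        -- of durability for much smoother write latency.
        SET GLOBAL innodb_flush_log_at_trx_commit = 2;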

    Read the article

  • How to reduce latency of data sent through a REST api

    - by Sid
    I have an application which obtains data in JSON format from one of our other servers. The problem I am facing is that there is a significant delay when requesting this information. Since a lot of data is passed (approx. 1000 records per request, where each record is pretty huge), is there a way that compression would help reduce the delay? If so, which compression scheme would you recommend? I read on another thread that the pattern of the data also matters a lot for the type of compression to be used. The pattern of the data is consistent and resembles the following: :desc=>some_description :url=>some_url :content=>some_content :score=>some_score :more_attributes=>more_data Can someone recommend a solution for reducing this delay? The delay is approx. 6-8 seconds. I'm using Ruby on Rails to develop this application, and the server providing the data uses Python for the most part.
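
    Gzip tends to work very well here: JSON with repetitive keys like these records often compresses 5-10x, which usually matters far more than the choice between schemes. A sketch of the client side, assuming the Python server (or a proxy in front of it) can be configured to serve gzip; the endpoint URL is a placeholder:

        require 'net/http'
        require 'zlib'
        require 'stringio'

        uri = URI('http://example.com/records.json') # hypothetical endpoint
        req = Net::HTTP::Get.new(uri.request_uri)
        req['Accept-Encoding'] = 'gzip'              # ask for a compressed body

        res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }

        body = if res['Content-Encoding'] == 'gzip'
                 Zlib::GzipReader.new(StringIO.new(res.body)).read
               else
                 res.body                            # server sent it uncompressed
               end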

    Read the article

  • web service data type (contract)

    - by cyberguest
    Hi, I have a general design question. We have a fairly big data model that represents a clinical object; the object itself has 200+ child attributes in the hierarchy, and we have a SetObject operation and a GetObject operation. My question is: best-practice wise, would it make sense to use that single data model in both operations, or a different data model for each? The Get operation returns much more detail than what's needed for Set. An example of what I mean: the data model has, say, ProviderId and ProviderName attributes. In the Get operation, both ProviderId and ProviderName need to be returned. However, in the Set operation only ProviderId is needed, and ProviderName is ignored by the service, since the system has that information already. In this case, if the Get and Set operations use the same data model, ProviderName is exposed even for the Set operation. Does that confuse the consuming developer?
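
    One widely used compromise is to split the contract into a richer read model and a slimmer write model, so the Set operation only exposes what it actually consumes. A minimal sketch, assuming WCF-style data contracts; the class and member names are illustrative, not taken from the actual clinical model:

        using System.Runtime.Serialization;

        [DataContract]
        public class ProviderReadDto          // returned by GetObject
        {
            [DataMember] public int ProviderId { get; set; }
            [DataMember] public string ProviderName { get; set; } // display-only
        }

        [DataContract]
        public class ProviderWriteDto         // accepted by SetObject
        {
            [DataMember] public int ProviderId { get; set; } // all that Set needs
        }

    The cost is a second model to maintain; the gain is that the Set contract documents itself, so consuming developers are never left guessing which fields the service ignores.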

    Read the article

  • Catch/Raise event on table data update C#

    - by Incognito
    Hi, I have a 24/7 service which keeps setup (configuration data) for charging, routing, etc. in SQL Server. Once the service is started, it loads the data from a table using LINQ to SQL and uses it throughout the application. We need a way to update the setup data in the table without restarting the application, so I would like to know whether it is possible to catch/determine that the table was updated, so I can refresh the setup data in the application. In other words, is it possible to have events raised whenever there is a delete, update or insert on the table? Thank you.
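
    A sketch of one built-in option, SQL Server Query Notifications via SqlDependency, which raises an event in the application when rows matched by a registered query change. It assumes Service Broker is enabled on the database and the query meets the notification rules (explicit column list, two-part table name); the table and reload method are hypothetical.

        using System.Data.SqlClient;

        SqlDependency.Start(connectionString); // once, at service start-up

        void Subscribe()
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT SetupId, SetupValue FROM dbo.Setup", conn))
            {
                var dependency = new SqlDependency(cmd);
                dependency.OnChange += (s, e) =>
                {
                    ReloadSetup(); // hypothetical: re-query with LINQ to SQL
                    Subscribe();   // notifications fire once; re-subscribe
                };
                conn.Open();
                using (cmd.ExecuteReader()) { } // registration happens on execute
            }
        }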

    Read the article

  • How to divide the header from binary data

    - by fixo2020
    Hi, I have this code:

        ofstream dest("test.txt", ios::binary);
        while (true) {
            ssize_t retval = recv(sd, buffer, sizeof(buffer), 0);
            if (retval <= 0) { delete[] buffer; break; }  // check before writing
            dest.write(buffer, retval);
        }

    Now, recv() returns the number of bytes read on each loop, right? And buffer contains them. This returns all the data, both the pseudo-header and the binary data (an image), but I want to know how to capture only the binary data. I know that the end of the header is "\n\r", right? What would be the better solution: a function that detects the "\n\r" and then captures the binary data after it? Or do I read all the data into memory and parse it afterwards? But how? I'm desperate :(
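
    A sketch of the streaming approach, assuming an HTTP-style header that ends with a blank line ("\r\n\r\n"): buffer the bytes until the terminator appears, write whatever follows it, and from then on write every received byte straight to the file.

        #include <fstream>
        #include <string>
        #include <sys/socket.h>
        #include <sys/types.h>

        void receive_image(int sd)
        {
            std::string   pending;            // bytes seen before header ends
            bool          header_done = false;
            std::ofstream dest("image.bin", std::ios::binary);
            char          buffer[4096];

            for (;;) {
                ssize_t n = recv(sd, buffer, sizeof(buffer), 0);
                if (n <= 0) break;            // error or connection closed
                if (header_done) { dest.write(buffer, n); continue; }

                pending.append(buffer, n);
                std::string::size_type pos = pending.find("\r\n\r\n");
                if (pos != std::string::npos) {
                    header_done = true;       // skip header + terminator
                    dest.write(pending.data() + pos + 4,
                               (std::streamsize)(pending.size() - pos - 4));
                }
            }
        }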

    Read the article

  • Copy unmanaged data into managed array

    - by JeffRSon
    I need to copy native (i.e. unmanaged) data (a byte*) into a managed byte array (array<byte>^) with C++/CLI. I tried Marshal::Copy (the data is pointed to by const void* data and is dataSize bytes):

        array<byte>^ _Data=gcnew array<byte>(dataSize);
        System::Runtime::InteropServices::Marshal::Copy((byte*)data, _Data, 0, dataSize);

    This gives error C2665: none of the 16 overloads can convert all parameters. Then I tried

        System::Runtime::InteropServices::Marshal::Copy(new IntPtr(data), _Data, 0, dataSize);

    which produces error C2664: parameter 1 cannot be converted from "const void*" to "__w64 int". So how can it be done, and is Marshal::Copy indeed the "best" (simplest/fastest) way to do so?
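
    A sketch of the usual workaround: Marshal::Copy has no raw-pointer overload, and the IntPtr constructor wants a non-const void*, so the const has to be cast away first (safe here, since Copy only reads from the source). Note also that IntPtr is a value type, so it is constructed without new.

        array<byte>^ _Data = gcnew array<byte>(dataSize);
        System::Runtime::InteropServices::Marshal::Copy(
            System::IntPtr(const_cast<void*>(data)), _Data, 0, dataSize);

    The alternative is pin_ptr<byte> on the first array element plus memcpy, which avoids the cast but pins the managed array for the duration of the copy.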

    Read the article

  • Convert JSON data into String

    - by san6086
    Hi, I am converting JSON data into a String. Please find the JSON data below. I am facing an issue where the system is unable to convert null values into a string, so I am getting the following error: can't convert nil into String (TypeError).

    JSON data: {"success":true,"message":null,"data":null}

    Code used:

        c = Curl::Easy.new(Configuration.fetch("<URL where we can find the above JSON DATA and nothing else>"))
        # c.follow_location = true
        # c.http_auth_types = :basic
        # c.username = Configuration.fetch('auth_user', false)
        # c.password = Configuration.fetch('auth_pass', false)
        # c.headers["User-Agent"] = 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.52 Safari/537.17'
        # c.perform
        result = JSON.parse(c)
        puts result["Success"]

    Please help.
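
    A sketch of the likely fix, assuming the curb and json gems: the request has to be performed before there is a response body, JSON.parse needs the body string rather than the Curl::Easy object itself, and JSON keys are case-sensitive ("success", not "Success"):

        require 'curb'
        require 'json'

        c = Curl::Easy.new(url)          # url as in the original code
        c.perform                        # actually issue the request
        result = JSON.parse(c.body_str)  # parse the response body, not the object
        puts result["success"].inspect   # inspect prints nil and booleans safely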

    Read the article

  • Copying just the data from one Database to another

    - by monksy
    I'm not sure if this is the site for this question or not [if so, put it in a comment or vote to move it]. How can I copy only the data from one database to another within the same server on SQL Server 2005? The two databases have the same schema but not the same data, and I'm trying to get the data from one into the other. I am not able to restore from a snapshot [that screws up the security settings on the database], and I'm not able to use the Import Data wizard, because it tries to copy over schema data as well.
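
    Since both databases live on the same instance, plain INSERT ... SELECT with three-part names is often the simplest route; a sketch with hypothetical table and column names, repeated per table with referenced (parent) tables copied first:

        -- Only needed if the target column is an IDENTITY column.
        SET IDENTITY_INSERT TargetDb.dbo.Customers ON;

        INSERT INTO TargetDb.dbo.Customers (CustomerId, Name)
        SELECT CustomerId, Name
        FROM SourceDb.dbo.Customers;

        SET IDENTITY_INSERT TargetDb.dbo.Customers OFF;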

    Read the article

  • PHP Losing variable data

    - by Conor B
    Hi, I'm having an issue with PHP losing data in a variable. There is quite a bit of data in the variable, because it basically contains a binary file, but I'm wondering if that is cause for it to completely lose its contents. Looking at a snippet from my code which is used to deal with email attachments:

        var_dump($data);
        if (array_key_exists('filename', $params) || array_key_exists('name', $params)) {
            var_dump($data);
            ...
        }

    The first var_dump gives the desired output of the file: "string(283155) " --Apple-Mail-5-930065543 ... etc., while the second gives an output of: string(0) "" ... string(0) "" Any idea why this is happening? Does PHP just drop data in variables if they are really large? (I didn't think so, as I've never had this problem before.) If so, any workaround? Thanks!

    Read the article

  • Is it possible to put a form control on its own thread?

    - by BVernon
    I'm using a DataGridView, and some operations that I do cause it to become unresponsive for periods of time. Normally I would put data processing in its own thread to make the form more responsive, but in this case it's the DataGridView itself that's taking so long. This leads me to wonder whether it's possible to have the main form on one thread and the DataGridView on another, so it doesn't prevent the main form from responding. I completely understand that doing so is probably not 'safe' and likely opens up a can of worms that makes it hardly worth trying, and I fully expect this post will get down-votes for merely suggesting such a ridiculous idea. Is this possible? And if so, how would you go about it?
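
    It is possible in the narrow sense sketched below: a single control cannot live on a different thread than its parent form, but an entire second form (hosting the DataGridView) can run on its own UI thread with its own message pump, so a stall there never blocks the main form. GridForm is a hypothetical form containing the grid.

        using System.Threading;
        using System.Windows.Forms;

        // Run the grid's form on a dedicated UI thread with its own
        // message loop. The two forms must not touch each other's
        // controls directly; marshal with Control.Invoke/BeginInvoke.
        var uiThread = new Thread(() => Application.Run(new GridForm()));
        uiThread.SetApartmentState(ApartmentState.STA); // WinForms needs STA
        uiThread.IsBackground = true;                   // dies with the app
        uiThread.Start();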

    Read the article

  • Short POST data in HTTP

    - by Matt
    We're hosting a customer's Debian Linux web server running a PHP-based web application. The server sits behind our firewall with its own virtual interface, and port 80 is forwarded internally to a machine sitting in the DMZ. The issue we're having is that when data is POSTed to the server, it seems to be cut short for some users. It's reproducible for some users on the same box, but when the same user sends the same data on the same LAN from another PC, it works. The data gets cut to around 1140 bytes, I'm told. Any idea why this might be happening? The customer is blaming our firewall, but then surely we'd have issues with other services. I suspect it's a problem with the website itself. Suggestions on how to isolate the problem would be of help. Our firewall is Astaro. EDIT: The customer has temporarily set the Ethernet frame size on the server to 500 bytes. This made it work for now! I know some of the customers are using an internet provider that runs PPPoE.

    Read the article

  • Exporting Client Data from GroupWise 6.5 to Outlook 2010 without Crashing

    - by Adam Doherty
    My employer has recently moved from Novell GroupWise 6.5 to Exchange 2010. We've imposed mailbox limits on staff, but we still need to move their old messages, contacts, calendars, etc. over to Outlook 2010. Our problem is this: the Novell MAPI client is slow within Outlook 2010, and exporting messages to a PST file (for later re-attachment and offline backup purposes) crashes the GroupWise server. Connecting to the server in Outlook via IMAP to export messages to PST is faster and apparently more stable, but it also crashes the server. We'll be keeping our GroupWise server online internally until the end of the year, but I have staff with mailboxes approaching 12 gigabytes, which is fine if we're going to move the data to offline storage (a DVD set); but if I keep crashing the server every time I try to get the data, I'll just be spinning my wheels. In my first attempt, I tried to move mail for a staff member with 3 GB of data. The transfer lasted roughly 8 hours before crashing. I'm wondering if there is an open source solution to my problem. Paid solutions exist, but we're a not-for-profit organization and have too many staff to justify the cost of per-seat licenses just to migrate mail.

    Read the article

  • 24TB RAID 6 configuration

    - by Phil
    I am in charge of a new website in a niche industry that stores lots of data (10+ TB per client, growing to 2 or 3 clients soon). We are considering ordering about $5,000 worth of 3 TB drives (10 in a RAID 6 configuration and 10 for backup), which will give us approximately 24 TB of production storage. The data will be written once and remain unmodified for the lifetime of the website, so we only need to do a backup one time. I understand basic RAID theory; however, I am not experienced with it. My question is: does this sound like a good configuration? What potential problems could this setup cause? Also, what is the best way to do a one-time backup? Have two RAID 6 arrays, one for offsite backup and one for production? Or should I back up the RAID 6 production array to a JBOD? EDIT: The data server is running Windows Server 2008 x64. EDIT 2: To reduce rebuild time, what would you think about using two RAID 5s instead of one RAID 6?

    Read the article

  • changing filesystem format from xfs to ext4 without losing data

    - by A.Rashad
    I have a fresh Lucid Lynx (Ubuntu 10.04) install on a laptop, where I defined the filesystems as:

    - mount point / on ext4 (46 GB)
    - mount point /home on jfs (63 GB)
    - swap of 3 GB

    I left the machine overnight to do some task, without the AC power supply. The next morning I found it on standby, the task completed, but the filesystem was not reachable: it gave me an I/O error. It seems there is a problem with jfs and standby. Anyway, to avoid any hassle, I want to move this mount point from the jfs format to ext4. Can I do this without losing data and without the need to place the data in a temporary location until the transformation is done? Sorry to mention it, but I recall back in the Windows days we would change FAT16 to FAT32, or FAT32 to NTFS, without having to lose the data. I hope this is available on Linux. Update: The /home filesystem was xfs, not jfs, and it seems there is a bug with this filesystem; for some reason I had to re-install the OS twice, until I ended up with ext4 for the entire /. However, as a conclusion, it seems that there is no way to make such a conversion.

    Read the article

  • Looking for recommendations on OCR problem - tabular numeric data

    - by ldigas
    I have 20 pages of experimental measurement data which I need to digitize. The results are in tabular form, scanned at 600 dpi resolution, and as far as scans go they came out pretty clean and readable. For an example of how it looks, see here (but beware: it is a rather big scan, about 5 MB; no problem for any broadband connection, but dial-ups should approach with caution!) ... and I need it finished by Sunday afternoon (:-o) <-- smiley in a state of panic (then why didn't you start sooner?) ... yea, yeah ... I know ... but it came up late, and I wasn't expecting to need this data as well. So, I'm looking for recommendations. I haven't much experience with OCR programs, save scanning a page or two of pure text, and, just to mention, I have no wish to test every OCR program out there. So this isn't a "name your OCR favourite". What I'm looking for is advice from someone who has done something like this, and his/her experience of the best way to go about it. I need the data in txt form, but since it will have to be checked (by plotting it, and simply watching whether some points "jump out") I'll probably be entering it into Excel at first.

    Read the article

  • Western Digital My Book: Can't access the data on the drive

    - by Bryan Denny
    My girlfriend has an external hard drive by Western Digital called a My Book. When the external drive is connected, it does not show up as an accessible disk drive on the computer. However, it shows up fine in Device Manager. I can also see it in Disk Management, but the volume is not mapped to a drive letter, nor can I change the drive letter; it only gives me access to Delete Volume. I would rather not lose the data on the drive if possible. What can I do from here to get to the data? Things I've tried/know:

    - Uninstalled the drivers and re-installed them.
    - The device does the same thing when attached to either her Win7 laptop or my Win8 laptop.
    - I don't think there's an issue with the HDD itself: no clicking noises, etc. I ran Western Digital Data LifeGuard Diagnostics (DLGDIAG) and the SMART status was a PASS; all of the SMART disk information looked fine. I haven't had the time to run the full diagnostic tests yet, but I do not believe it's a mechanical issue.
    - The hard drive is inside an enclosure; I have not attempted to pry the drive out yet.

    How can I get Windows to properly detect this drive?

    Read the article

  • How can I solve display glitches and poor performance with ATI fglrx driver on my ThinkPad X100e?

    - by rewbs
    I noticed that video performance on my ThinkPad X100e was very poor compared to Windows 7, so I installed the ATI fglrx proprietary drivers via the "Additional Drivers" dialogue box. The system has an ATI Radeon Mobility HD 3200 chip. The result of installing the drivers is pretty devastatingly negative, with symptoms such as skewed content in windows, and browser tabs and text boxes failing to refresh when their content changes. In fact, please excuse typos in this post, because I can't really see what I am typing. :) I also notice that HD video playback performance is no better, perhaps even worse, than prior to installing the drivers. Here's the output of fglrxinfo:

        display: :0.0  screen: 0
        OpenGL vendor string: ATI Technologies Inc.
        OpenGL renderer string: ATI Radeon HD 3200 Graphics
        OpenGL version string: 3.3.10237 Compatibility Profile Context

    Output of lspci | grep -i vga:

        01:05.0 VGA compatible controller: ATI Technologies Inc RS780M/RS780MN [Radeon HD 3200 Graphics]

    I'm on Ubuntu 10.10 with kernel 2.6.35-22-generic-pae. What can I try? Many thanks, -R

    Read the article

  • Divide pivot table data by an arbitrary column in another table

    - by rsavu
    Hello all, I have this data from a pivot table:

        Countries P1 P2 Total
        Country 1 10 69
        Country 2 36 2 92
        Country 3 21 24 100
        Country 4 22 77
        Country 5 13 79
        Country 6 12 1 48
        Country 7 14 29
        Country 8 22 1 46
        Country 9 4 1 31
        Country 10 16 7 120
        Country 11 25 2 114
        Country 12 8 11 68
        Country 13 5 27
        Country 14 11 3 23
        Country 15 6 19
        Country 16 33 79

    where the 1st column is the country name, the 2nd and 3rd columns are the tickets introduced in the system, and the 4th column is the total (disregard the data; the total is not accurate). Additionally, I have another table that looks like this:

        Country P1 P2
        Country 1 2 3
        Country 2 2 2
        Country 3 0 2
        Country 4 0 3
        Country 5 1 1
        Country 6 2 2
        Country 7 1 2
        Country 8 3 3
        Country 9 1 4
        Country 10 2 1
        Country 11 4 2
        Country 12 2 1
        Country 13 3 2
        Country 14 3 3
        Country 15 1 2
        Country 16 2 2

    where the data represents the number of users of the application in each country. I want to be able to show the number of tickets submitted divided by the number of users in each country. Any ideas how to do that? Thank you very much, Razvan
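
    One way to sketch the division in a helper range, assuming the pivot table starts at Sheet1!A1 and the user counts sit on Sheet2 with the country in column A; the field names and ranges are placeholders to adapt:

        =GETPIVOTDATA("P1", Sheet1!$A$1, "Countries", $A2)
           / VLOOKUP($A2, Sheet2!$A:$C, 2, FALSE)

    Fill the formula down the country list, and switch "P1" plus the VLOOKUP column index (2 to 3) for the P2 ratio.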

    Read the article

  • SQL Error (1064) when importing data from SQL file

    - by mejpark
    I have a MySQL database which was originally set up with the default latin1 character set and latin1_swedish_ci collation. I was using the database like this for some time, until I noticed strange characters on my production web site, which is powered by a database exported from my development machine. At that point, I changed the default character set of the database and tables to utf8 and the collation to utf8_unicode_ci, converted the latin1 data inside each table to utf8 (using the 'convert data' option), and exported the database as a single SQL file using HeidiSQL. When the resulting SQL file is opened in Notepad++, several characters are rendered incorrectly. For example, en dashes (–) are displayed as â€“, and e with an accent (é) is displayed as Ã©. I changed the encoding of the file from ANSI to UTF-8 (using the encoding menu option in Notepad++) and the offending characters are rendered correctly. I saved the new utf8-encoded SQL file and attempted to import the contents into the MySQL database on my production server. The import process fails with the following error:

        /* SQL Error (1064): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?# -------------------------------------------------------- # Host: ' at line 1 */
        /* Error with snippets directory: The specified path was not found */

    The head of the SQL file:

        # --------------------------------------------------------
        # Host: 127.0.0.1
        # Server version: 5.1.33-community
        # Server OS: Win32
        # HeidiSQL version: 6.0.0.3773
        # Date/time: 2011-04-20 09:48:36
        # --------------------------------------------------------

    It chokes on the first line of the file, which is commented out. Why is this happening? I didn't have a problem loading data from SQL files until I changed the character set and collation of the database. I came up with an ugly workaround by performing the following steps:

    1. Export the database as a single SQL file using HeidiSQL.
    2. Open the resulting file in Notepad++ and convert it from ANSI to UTF-8 encoding.
    3. Create a new empty file in Notepad++, paste in the UTF-8 text and save the file normally.

    What am I missing here?

    Read the article

  • How To Replace Laptop HDD Without Losing Data?

    - by Ishan
    Hello, I recently went to a Dell service center, and they tell me the HDD is faulty and needs to be replaced. I have a Studio 1457 laptop with a 500 GB HDD (purchased in May 2010, still under warranty) and don't want to lose the data. I have searched a bit, and I think it may be best to use disk-imaging software for this task; however, I don't know of a good one. I have the following steps in mind:

    1. Get a 1 TB external HDD.
    2. Make an image of the existing 500 GB HDD and store it on the external disk.
    3. Install the new HDD, then install a brand-new copy of Windows and the imaging software on it.
    4. Using the same software I used to make the image, restore the old HDD image onto the new one.

    However, I have some questions in mind. First, is this possible? Second, I live in a country where piracy is a big issue, and I am sure the support executive who comes to change the HDD will bring a pirated copy. But I have genuine Windows 7 Pro and don't want to lose it. Now, Dell does not supply OS disks, so I can't install it on the new HDD! If I follow the above steps, which version of Windows 7 will be retained: the one in the image (genuine) or the one on the new HDD (pirated)? I am ready to purchase good software for this task; my budget is $50-60. Since the laptop is under warranty, the new HDD will be free. One last thing: I have created a Windows migration file whose size is 70 GB. Can it be used to move from Windows 7 Pro to Windows 7 Pro? (In case I get a genuine copy of Windows 7!) Any other method to save all the data? Thanks in advance.

    Read the article
