Search Results

Search found 6317 results on 253 pages for 'persistent storage'.


  • Treating a fat webservice in .net 3.5 c#

    - by Chris M
    I'm dealing with an obese 3rd-party webservice that returns about 3 MB of data for a simple search result; about 50% of the data in that response is junk. Would it make sense, then, to remap this data to my own result object and ditch the response, so I'm storing 1-2 MB in memory for filtering and sorting rather than using the web response's own object and using 2-4 MB, or am I missing a point? So far I've been accessing the webservice from a separate project and using a new class to provide the interaction and to handle the persistence, so my project looks like this:

        |- Web (mvc2 proj)
        |- DAL (database/storage fluent-nhibernate)
        |- SVCGateway (interaction layer + webservice related models)
        |- Services
        --------------
        |- Tests
        |- Specs

    I'm trying to make the application behave fast, and I also need to store the result set temporarily in case a customer goes to view the product and wants to go back to the results. (The service returns only 500 of a possible 14K results.) So basically I'm looking for confirmation that I'm doing the right thing in pushing the results into my own objects, or whether I'm breaking some rule, or even whether there's a better way of handling it. Thanks
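    A minimal sketch of the remapping approach; ServiceItem and the property names are illustrative stand-ins for the generated proxy types, not the real service contract:

        using System.Collections.Generic;
        using System.Linq;

        // Slim result type holding only what the UI actually needs.
        public class SearchResult
        {
            public string Sku { get; set; }
            public string Title { get; set; }
            public decimal Price { get; set; }
        }

        public static class ResultMapper
        {
            // ServiceItem stands in for the fat generated web-service type.
            public static IList<SearchResult> Map(IEnumerable<ServiceItem> items)
            {
                // Copy only the needed fields; the 3 MB response can then be
                // garbage-collected instead of living in the session or cache.
                return items.Select(i => new SearchResult
                {
                    Sku = i.Sku,
                    Title = i.Title,
                    Price = i.Price
                }).ToList();
            }
        }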

    Read the article

  • (NOT) NULL for NVARCHAR columns

    - by Anders Abel
    Allowing NULL values on a column is normally done to allow the absence of a value to be represented. When using NVARCHAR there is already a possibility to have an empty string, without setting the column to NULL. In most cases I cannot see a semantic difference between an NVARCHAR with an empty string and a NULL value for such a column. Setting the column as NOT NULL saves me from having to deal with the possibility of NULL values in the code, and it feels better not to have two different representations of "no value" (NULL or an empty string). Will I run into any other problems by setting my NVARCHAR columns to NOT NULL? Performance? Storage size? Anything I've overlooked on the usage of the values in the client code?
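    A sketch of the migration this implies, assuming SQL Server and a hypothetical Customers.Notes column; existing NULLs have to be cleared before the constraint can be applied:

        -- Empty string becomes the single representation of "no value".
        ALTER TABLE Customers
            ADD CONSTRAINT DF_Customers_Notes DEFAULT (N'') FOR Notes;

        -- Clear out existing NULLs so the NOT NULL constraint can be added.
        UPDATE Customers SET Notes = N'' WHERE Notes IS NULL;

        ALTER TABLE Customers
            ALTER COLUMN Notes NVARCHAR(500) NOT NULL;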

    Read the article

  • Giving writing permissions for IIS user at Windows 2003 Server

    - by Steve
    I am running a website on Windows Server 2003 and IIS6, and I am having problems writing or deleting files in a temporary folder, getting this kind of warning:

        Warning: unlink(C:\Inetpub\wwwroot\cakephp\app\tmp\cache\persistent\myapp_cake_core_cake_): Permission denied in C:\Inetpub\wwwroot\cakephp\lib\Cake\Cache\Engine\FileEngine.php on line 254

    I went to the tmp directory and in the properties I gave the IIS user the following permissions:

    - Read & Execute
    - List Folder Contents
    - Read

    And it still shows the same warnings. When I am in the properties window, if I click on Advanced the IIS username appears twice: one entry with Allow type and Read & Execute permissions, and the other with Deny type and Special permissions. My question is: should I give this user not only the Read & Execute permissions but also these ones?

    - Create Attributes
    - Create Files / Write Data
    - Create Folders / Append Data
    - Delete Subfolders and Files
    - Delete

    They are available to select if I click on the Edit button over the username. Wouldn't I be opening a security hole if I do this? Otherwise, how can I read and delete the files my website uses? Thanks.

    Read the article

  • Any strategies for assessing the trade-off between CPU loss and memory gain from compression of data

    - by indiehacker
    Are very large TextProperties a burden? Should they be compressed? Say I have information stored in two attributes of type TextProperty in my datastore entities. The strings are always the same length of 65,000 characters and have lots of repeating integers, a sample appearing as follows:

        entity.pixel_idx   = 0,0,0,0,0,0,0,0,0,1,1,1,1,1,1,1,1,1,1,1,5,5,5,5,5,5,5,5,5,5,5,5....etc.
        entity.pixel_color = 2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,2,1,1,1,1,1,1,1,1,1,1,1,1,...etc.

    The above could also be represented using much less storage by compressing, say, using only each integer and the length of its series ('0,8' for '0,0,0,0,0,0,0,0'), but then it takes time and CPU to compress and decompress. Any general ideas? Are there some tricks for testing different approaches to the problem?
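    A quick sketch of the run-length idea in plain Python, as a baseline to benchmark against zlib.compress on the raw string; datastore read/write sizes and CPU time can then be measured for each variant:

        # Run-length encode/decode comma-separated integer strings.
        def rle_encode(csv_string):
            values = csv_string.split(',')
            runs = []
            current, count = values[0], 1
            for v in values[1:]:
                if v == current:
                    count += 1
                else:
                    runs.append('%s:%d' % (current, count))
                    current, count = v, 1
            runs.append('%s:%d' % (current, count))
            return ','.join(runs)

        def rle_decode(encoded):
            parts = []
            for run in encoded.split(','):
                value, count = run.split(':')
                parts.extend([value] * int(count))
            return ','.join(parts)

        assert rle_decode(rle_encode('0,0,0,1,1,5')) == '0,0,0,1,1,5'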

    Read the article

  • How can I persist a large Perl object for re-use between runs?

    - by Alnitak
    I've got a large XML file, which takes over 40 seconds to parse with XML::Simple. I'd like to be able to cache the resulting parsed object so that on the next run I can just retrieve the parsed object and not reparse the whole file. I've looked at using Data::Dumper, but the documentation is a bit lacking on how to store and retrieve its output from disk files. Other classes I've looked at (e.g. Cache::Cache) appear designed for storage of many small objects, not a single large one. Can anyone recommend a module designed for this?

    EDIT: The XML file is ftp://ftp.rfc-editor.org/in-notes/rfc-index.xml. On my Mac Pro the benchmark figures for reading the entire file with XML::Simple vs Storable are:

                s/iter   test1   test2
        test1     47.8      --   -100%
        test2    0.148  32185%      --
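    A minimal sketch of the Storable approach (a core module), assuming the cache can simply be invalidated whenever the XML file is newer than the cached copy:

        use strict;
        use warnings;
        use Storable qw(store retrieve);
        use XML::Simple;

        my $xml_file   = 'rfc-index.xml';
        my $cache_file = 'rfc-index.stor';

        my $data;
        if (-e $cache_file && (stat $cache_file)[9] >= (stat $xml_file)[9]) {
            $data = retrieve($cache_file);   # fast path: load the cached structure
        } else {
            $data = XMLin($xml_file);        # slow path: full parse
            store($data, $cache_file);       # refresh the cache for next run
        }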

    Read the article

  • Getting Older PHP Extensions

    - by maSnun
    Hello All, It happens that I need the GD, SQLite and a few more extensions for PHP 5.1, but I can't find out where to get them. I am using WinBinder to develop some desktop applications for Windows with PHP. The minimal PHP 5.1 pack has the WinBinder extension only. I need other extensions for enhanced features like image editing or data storage. Can anybody help? I really need this. Thanks and Regards, Masnun

    Read the article

  • python dictionary with constant value-type

    - by s.kap
    Hi there, I bumped into a case where I need a big (= huge) Python dictionary, which turned out to be quite memory-consuming. However, since all of the values are of a single type (long), as are the keys, I figured I could use a Python (or numpy, doesn't really matter) array for the values, and wrap the needed interface (in: x; out: d[x]) with an object which actually uses these arrays for the key and value storage. I can use an index-conversion object (mapping an input to an index of 1..n, where n is the count of distinct values) and return array[index]. I can elaborate on some techniques for how to implement such an indexing method with a reasonable memory requirement; it works and is even pretty good. However, I wonder whether such a data-structure object already exists (in Python, or wrapped for Python from C/C++), in any package (I checked collections, and did some Google searches). Any comment will be welcome, thanks.
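    A small sketch of the wrapper described above, assuming keys and values both fit in 64-bit integers; the index dict still holds Python objects, but the values live in one typed numpy array instead of millions of boxed longs:

        import numpy as np

        class IntIntMap(object):
            """Dict-like map backed by a numpy array for the values."""

            def __init__(self, capacity):
                self._index = {}                          # key -> slot number
                self._values = np.zeros(capacity, dtype=np.int64)

            def __setitem__(self, key, value):
                slot = self._index.setdefault(key, len(self._index))
                self._values[slot] = value

            def __getitem__(self, key):
                return int(self._values[self._index[key]])

        d = IntIntMap(1000000)
        d[42] = 10 ** 12
        print(d[42])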

    Read the article

  • Core Data and iTunes File Sharing - Move/hide the .sqlite file on app update?

    - by Eric
    I have an iPad app that uses Core Data for data storage. I would like to enable file sharing in iTunes and I don't really want the users to be able to delete or modify the .sqlite file. Can I move the file to a different, hidden directory? Alternatively, could the file be made read-only? I wouldn't mind users having access to the file as long as it couldn't be changed. I suspect there is a trivial solution that is escaping me at the moment.
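    One possible direction, sketched below: iTunes file sharing exposes only the Documents directory, so pointing the Core Data store at a folder under Library/ keeps the .sqlite file out of the sharing pane (paths and names below are illustrative):

        // Build a store URL under Library/ instead of Documents/.
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory,
                                                             NSUserDomainMask, YES);
        NSString *dir = [[paths objectAtIndex:0]
                            stringByAppendingPathComponent:@"PrivateData"];
        [[NSFileManager defaultManager] createDirectoryAtPath:dir
                                   withIntermediateDirectories:YES
                                                    attributes:nil
                                                         error:NULL];
        NSURL *storeURL = [NSURL fileURLWithPath:
                              [dir stringByAppendingPathComponent:@"Model.sqlite"]];
        // Hand storeURL to addPersistentStoreWithType:configuration:URL:options:error:.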

    Read the article

  • LINQ to SQL - Insert yielding strange behavior.

    - by Isaac
    Hi, I'm trying to insert several newly created items into the database. I have a LINQ2SQL-generated class called "Order". Inside Order there's a property called "OrderItems", which is also generated by LINQ2SQL and represents the items of that Order. So far so good. The problem I'm having right now is when I try to add more than one newly created OrderItem to an Order, i.e.:

        Order o = orderWorker.GetById( 10 );
        for( int i = 0; i < 5; ++i )
        {
            OrderItem oi = new OrderItem
            {
                Order = o,
                Price = 100,
                ShippingPrice = 100,
                ShippingMethod = ...
            };
            o.OrderItems.Add( oi );
        }
        context.SubmitChanges();

    Unfortunately, only a single entity is being added. Yes, I checked the generated SQL by adding Context.Log = Console.Out, and yes, only one statement was created. Any clues? By the way, I know I'm not using InsertOnSubmit, but the documentation says:

        You can explicitly request Inserts by using InsertOnSubmit. Alternatively, LINQ to SQL can infer Inserts by finding objects connected to one of the known objects that must be updated. For example, if you add an Untracked object to an EntitySet(TEntity) or set an EntityRef(TEntity) to an Untracked object, you make the Untracked object reachable by way of tracked objects in the graph. While processing SubmitChanges, LINQ to SQL traverses the tracked objects and discovers any reachable persistent objects that are not tracked. Such objects are candidates for insertion into the database.

    Thank you very much for your time.
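    For comparison, a sketch that makes the inserts explicit instead of relying on inference; it assumes the generated context exposes an OrderItems table:

        Order o = orderWorker.GetById( 10 );
        for( int i = 0; i < 5; ++i )
        {
            OrderItem oi = new OrderItem
            {
                Order = o,
                Price = 100,
                ShippingPrice = 100
            };
            // Explicitly mark the new object for insertion on SubmitChanges.
            context.OrderItems.InsertOnSubmit( oi );
        }
        context.SubmitChanges();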

    Read the article

  • e-commerce product data/metadata schemas

    - by Shreko
    Trying to figure out how a product data/metadata schema is designed. For example, how does an e-commerce site enter a product spec? Does it copy and paste from the manufacturer's spec sheet, enter it in its own fields, or something else? Here is an example, looking at the D3000 Nikon DSLR.

    Manufacturer: http://nikon.ca/en/Product.aspx?m=17300&disp=Specs

    futureshop.ca: www.futureshop.ca/en-CA/product/nikon-nikon-d3000-10-2mp-dslr-camera-with-18-55mm-lens-kit-d3000/10128435.aspx?path=865c2348a1542e848982c9dbd9253483en02

    memoryexpress.com: www.memoryexpress.com/Products/PID-MX25539%28ME%29.aspx

    They are all slightly different in ordering and in parent/child fields. What storage is used for this type of info, an RDBMS or XML?

    Read the article

  • How can I implement "real time" messaging on Google AppEngine?

    - by Freed
    I'm creating a web application on Google AppEngine where I want the user to be notified as quickly as possible after certain events occur. The problem is similar to, say, a chat server, in that I need something happening on one connection (someone writing a message in a chat room) to propagate to a number of other connections (other people in that chat room get the message). To get speedy updates from the server to the client I'm planning on using long polling with XmlHttpRequest, hoping that AppEngine won't interfere other than possibly restricting the timeout. The real problem, however, is efficient notification between connections on AppEngine. Is there any support for this type of cross-connection notification on AppEngine that does not involve busy-waiting? The only tools I can think of to do this at all are either using the data storage (slow) or memcache (unreliable), and neither of them would let me avoid busy-waiting.

    Note: I know about XMPP support on AppEngine. It's related, but I want a browser-based solution; sending messages to the users by XMPP is not an option.

    Read the article

  • MySQL: Efficient Blobbing?

    - by feklee
    I'm dealing with blobs of up to - I estimate - about 100 kilobytes in size. The data is compressed already.

    Storage engine: InnoDB on MySQL 5.1
    Frontend: PHP (Symfony with Propel ORM)

    Some questions:

    1. I've read somewhere that it's not good to update blobs, because it leads to reallocation, fragmentation, and thus bad performance. Is that true? Any reference on this?

    2. Initially the blobs get constructed by appending data chunks. Each chunk is up to 16 kilobytes in size. Is it more efficient to use a separate chunk table instead, for example with fields as below (see the sketch after this list)?

           parent_id, position, chunk

       Then, to get the entire blob, one would do something like:

           SELECT GROUP_CONCAT(chunk ORDER BY position) FROM chunks WHERE parent_id = 187

       The result would be used in a PHP script.

    3. Is there any difference between the types of blobs, aside from the size needed for metadata, which should be negligible?
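    A sketch of what the chunk-table variant could look like; the composite primary key keeps the chunks of one parent clustered together in InnoDB, and the GROUP_CONCAT cap has to be raised because it defaults to 1024 bytes:

        CREATE TABLE chunks (
            parent_id INT UNSIGNED NOT NULL,
            position  INT UNSIGNED NOT NULL,
            chunk     MEDIUMBLOB   NOT NULL,
            PRIMARY KEY (parent_id, position)
        ) ENGINE=InnoDB;

        -- Raise the per-session limit before reassembling large blobs, and use
        -- an empty separator so binary chunks are concatenated byte-for-byte.
        SET SESSION group_concat_max_len = 1024 * 1024;

        SELECT GROUP_CONCAT(chunk ORDER BY position SEPARATOR '')
          FROM chunks
         WHERE parent_id = 187;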

    Read the article

  • Fast (de)serialization on iPhone

    - by Jacob Kuypers
    I'm developing a game/engine for iPhone OS. It's the first time I'm using Objective-C. I made my own binary format for geometry data, and for textures I'm focusing on PVRTC. That should be the optimal approach as far as speed and space are concerned. I really want to keep loading time to a minimum and - if possible - be able to save very fast as well. So now I'm trying to make my "Entity" stuff persistent without sacrificing performance. First I wanted to use NSKeyedArchiver. From what I've heard, it's not very fast. Also, what I want to serialize is mostly structs made of floats with some ints and strings, so there isn't really a need for all that "object graph" overhead. NSArchiver would have been more appropriate, but they kicked that off the iPhone for some reason. So now I'm thinking about making my own serialization scheme again. Am I wrong in thinking that NSKeyedArchiver is slow (I only read that, haven't tested it myself)? If so, what's the best way to encode/decode structs (with no pointers, mostly floats) without sacrificing speed?
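    For pointer-free structs, a plain C dump is about as fast as it gets. A minimal sketch, assuming a hypothetical Vertex struct; real code would also want a version/endianness header:

        #include <stdio.h>
        #include <stdlib.h>

        typedef struct { float x, y, z; int materialId; } Vertex;

        int save_vertices(const char *path, const Vertex *v, size_t count) {
            FILE *f = fopen(path, "wb");
            if (!f) return -1;
            fwrite(&count, sizeof count, 1, f);    /* record count header */
            fwrite(v, sizeof(Vertex), count, f);   /* raw struct payload  */
            fclose(f);
            return 0;
        }

        Vertex *load_vertices(const char *path, size_t *count) {
            FILE *f = fopen(path, "rb");
            if (!f) return NULL;
            if (fread(count, sizeof *count, 1, f) != 1) { fclose(f); return NULL; }
            Vertex *v = malloc(*count * sizeof(Vertex));
            if (v) fread(v, sizeof(Vertex), *count, f);
            fclose(f);
            return v;
        }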

    Read the article

  • Sync offline JQuery Mobile form

    - by Dimas
    My name is Dimas and I work as a developer in a hospital. I've developed a web form with jQuery Mobile which is accessed from an Android tablet. It sends data (text and FILES) to the server (PHP) and it's saved in a MySQL database. These tablets are used by doctors who are visiting patients in their homes. Some of these patients live in remote areas with very bad Internet connections, so I'm thinking of using the HTML5 offline features to solve these situations. I started reading about it but I'm a newbie with this technology. I have some questions:

    - Do you think this is the best/easiest solution?
    - If I could save the form data in local SQLite storage when the tablet is offline, how could I synchronize it with the data in the MySQL server when the tablet comes back online? (A rough sketch follows below.)

    Thanks
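    One possible direction, sketched with localStorage rather than SQLite and assuming a hypothetical /save.php endpoint: queue text-only submissions while offline and flush them when connectivity returns. File uploads would need separate handling, e.g. re-prompting once online.

        // Queue the submission locally when offline, post it straight away otherwise.
        function submitOrQueue(formData) {
          if (navigator.onLine) {
            $.post('/save.php', formData);
          } else {
            var queue = JSON.parse(localStorage.getItem('pending') || '[]');
            queue.push(formData);
            localStorage.setItem('pending', JSON.stringify(queue));
          }
        }

        // Flush the queue as soon as the browser reports connectivity again.
        window.addEventListener('online', function () {
          var queue = JSON.parse(localStorage.getItem('pending') || '[]');
          localStorage.removeItem('pending');
          queue.forEach(function (item) { $.post('/save.php', item); });
        });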

    Read the article

  • How do I implement Hibernate Pagination using a cursor (so the results stay consistent, despite new records being added)?

    - by hunterae
    Hey all, is there any way to maintain a database cursor using Hibernate between web requests? Basically, I'm trying to implement pagination, but the data that is being paged is constantly changing (i.e. new records are added to the database). We are trying to set it up such that when you do your initial search (returning a maximum of 5000 results) and you page through the results, the same records always appear on the same page (i.e. we're not continuously re-running the query each time the next and previous page buttons are clicked). The way we're currently implementing this is by selecting the (at most) 5000 primary keys from the table we're paging, storing those keys in memory, and then just using 20 primary keys at a time to fetch their details from the database. However, we want to get away from having to store these keys in memory and would much prefer a database cursor that we just keep going back to, moving backwards and forwards over it to generate pages. I tried doing this with Hibernate's ScrollableResults, but found that calling methods like next() and previous() causes an exception if you are within a different web request / Hibernate session (no surprise there). Is there any way to reattach a ScrollableResults object to a Session, much the same way you would reattach a detached database object to make it persistent? Are there any other approaches to implementing this data paging with consistent paging results without caching the primary keys?
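    A stateless alternative, sketched below: capture the highest primary key at search time and page against that snapshot boundary, so new rows never shift existing pages. It assumes a numeric id ordering; deletions can still shrink later pages.

        // Only maxId and the page number need to live between requests.
        @SuppressWarnings("unchecked")
        public List<Record> fetchPage(Session session, long maxId, int page, int pageSize) {
            return session.createQuery(
                        "from Record r where r.id <= :maxId order by r.id desc")
                    .setParameter("maxId", maxId)
                    .setFirstResult(page * pageSize)
                    .setMaxResults(pageSize)
                    .list();
        }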

    Read the article

  • How to write custom SQLite functions in Javascript inside a Webkit browser?

    - by Jay Godse
    I have just learned how to use the SQLite database for local storage in a Webkit web browser (e.g. Google Chrome or Apple Safari) using the Javascript API. For example the "Sticky Notes" application. However, I know that SQLite has a function called sqlite_create_function() that lets you add custom functions to your instance of SQLite on the fly which can then be used inside SQL queries. This function is described at sqlite.org. I also know that you can call an equivalent of this API in Ruby as described here. QUESTION: Can anybody show me how to do this in Javascript - i.e. write a custom function in Javascript that can be bound into the SQLite database at run time to be called by the SQLite engine, and all inside a Webkit browser?

    Read the article

  • how to play an encrypted file in Android.

    - by user306517
    I need to be able to play an encrypted file in Android. The file is AAC. The only ways I can see to do this are either:

    1. decrypt the file to internal private storage and point the player at that file to play, or
    2. decrypt & decode the file to PCM and feed it to an AudioTrack.

    Option 1 isn't great because it takes a long time to do. Option 2 isn't great either, because I don't know how I can take advantage of the HW decoder to do this. Any ideas? tia.
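    A rough sketch of option 1, assuming an AES key and a hypothetical encrypted asset; IV/mode handling, error handling, and deleting the plaintext copy afterwards are all omitted:

        import android.content.Context;
        import android.media.MediaPlayer;
        import java.io.*;
        import javax.crypto.Cipher;
        import javax.crypto.CipherInputStream;
        import javax.crypto.spec.SecretKeySpec;

        public class EncryptedAudio {
            public static void decryptAndPlay(Context ctx, byte[] key) throws Exception {
                Cipher cipher = Cipher.getInstance("AES");
                cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"));

                // Decrypt into the app's private file area (MODE_PRIVATE).
                InputStream in = new CipherInputStream(
                        ctx.getAssets().open("track.aac.enc"), cipher);
                OutputStream out = ctx.openFileOutput("track.aac", Context.MODE_PRIVATE);
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) > 0; ) out.write(buf, 0, n);
                in.close();
                out.close();

                // Point MediaPlayer at the decrypted private file.
                FileInputStream fis = ctx.openFileInput("track.aac");
                MediaPlayer player = new MediaPlayer();
                player.setDataSource(fis.getFD());
                fis.close();
                player.prepare();
                player.start();
            }
        }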

    Read the article

  • Rails - Paperclip, getting width and height of image in model

    - by Corey Tenold
    Trying to get the width and height of the uploaded image while still in the model on the initial save. Any way to do this? Here's the snippet of code I've been testing with from my model. Of course it fails on "instance.photo_width".

        has_attached_file :photo,
          :styles => {
            :original => "634x471>",
            :thumb => Proc.new { |instance|
              ratio = instance.photo_width / instance.photo_height
              min_width  = 142
              min_height = 119
              if ratio > 1
                final_height = min_height
                final_width  = final_height * ratio
              else
                final_width  = min_width
                final_height = final_width * ratio
              end
              "#{final_width}x#{final_height}"
            }
          },
          :storage => :s3,
          :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
          :path => ":attachment/:id/:style.:extension",
          :bucket => 'foo_bucket'

    So I'm basically trying to do this to get a custom thumbnail width and height based on the initial image dimensions. Any ideas?
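    One possible direction, assuming Paperclip's Geometry helper: read the dimensions straight off the queued upload instead of relying on separate photo_width/photo_height columns.

        # While the upload is still queued for write (i.e. before the save finishes):
        file  = photo.queued_for_write[:original]
        geo   = Paperclip::Geometry.from_file(file)
        ratio = geo.width / geo.height   # Geometry returns floats, so no integer division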

    Read the article

  • Write Local File using Adobe Air Iframe

    - by user290687
    This is driving me mad. I am creating an AIR application and everything is working great. However, I would really like to have a form inside an iframe that, when the user clicks submit, saves the file to the local application storage directory. Right now I am able to do this and save the file with no problems when I just access the HTML page not inside an iframe. However, if I wrap the page in an iframe and hit submit, the file does not save. Any code examples would be very much appreciated. When I am using the iframe my code looks as follows:

        <iframe src="jobs/newjob.html" height="800px" width="800px"
                sandboxRoot="app:/" documentRoot="app:/sandbox"
                ondominitialize="setupBridge()">

    Read the article

  • Android and PHP - Do I need to use sessions?

    - by jtnire
    I have created an Android app that communicates with a PHP web server. They both send JSON to each other. My app is almost finished; however, there is one thing left to do: authentication. Since the user's username and password will be stored in Android SharedPreferences, is there any need to use PHP sessions, given that the user won't need to enter the username/password at every request? Since I can just send the username and password in the HTTP POST header for every request, and I will be using SSL, is this sufficient? I guess I could add an extra field in the header called 'random' that just adds a random value, to use as a salt so that the encrypted SSL payload will be different every time. The reason why I don't want to use sessions is that my Android app would either have to handle cookies or manage the storage of the session ID. If there are serious cons to using my method above, then I'm more than happy to use sessions, however all advice is appreciated. Thanks

    Read the article

  • Automatically update audit information on Entity

    - by Nix
    I have an entity model that has audit information on every table (50+ tables):

        CreateDate
        CreateUser
        UpdateDate
        UpdateUser

    Currently we are programmatically updating the audit information. Ex:

        if (changed)
        {
            entity.UpdatedOn = DateTime.Now;
            entity.UpdatedBy = Environment.UserName;
            context.SaveChanges();
        }

    But I am looking for a more automated solution. During save changes, if an entity is created/updated I would like to automatically update these fields before sending them to the database for storage. Any suggestion on how I could do this? Let me know if any more information is needed.
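    A sketch of one automated approach, assuming an ObjectContext-based Entity Framework model and that every audited entity is surfaced through a common interface (names are illustrative):

        // Requires: using System.Data; using System.Data.Objects;
        public interface IAuditable
        {
            DateTime CreateDate { get; set; }
            string CreateUser { get; set; }
            DateTime UpdateDate { get; set; }
            string UpdateUser { get; set; }
        }

        public partial class MyEntities   // the generated ObjectContext
        {
            partial void OnContextCreated()
            {
                SavingChanges += (sender, e) =>
                {
                    var entries = ObjectStateManager.GetObjectStateEntries(
                        EntityState.Added | EntityState.Modified);

                    foreach (var entry in entries)
                    {
                        var auditable = entry.Entity as IAuditable;
                        if (auditable == null) continue;

                        if (entry.State == EntityState.Added)
                        {
                            auditable.CreateDate = DateTime.Now;
                            auditable.CreateUser = Environment.UserName;
                        }
                        auditable.UpdateDate = DateTime.Now;
                        auditable.UpdateUser = Environment.UserName;
                    }
                };
            }
        }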

    Read the article

  • How to scale thumbnail to fit depending on tall or wide photo - Attachment_fu

    - by adamwstl
    I'm using attachment_fu and RMagick to create thumbnails after upload. The thumbnail is a fixed 135 x 135 px and I'm currently forcing the width to 135px on all photos. The problem is that if it's a wide and fat photo, it has to stretch the height awkwardly. Current attachment_fu setup:

        class PhotoImage < Image
          belongs_to :photo
          has_attachment :content_type => :image,
                         :size => 0..5.megabytes,
                         :storage => :s3,
                         :resize_to => '650x>',
                         :thumbnails => { :thumbnail => '135x>' } # :geometry => 'x50'
          validates_as_attachment
        end

    Here's what I'm trying to do: Thanks

    Read the article

  • Cross-platform HTML application options

    - by Charles
    I'd like to develop a stand-alone desktop application targeting Windows (XP through 7) and Mac (Tiger through Snow Leopard), and if possible iPhone and Android. In order to make it all work with as much common code as possible (and because it's the only thing I'm good at), I'd like to handle the main logic with HTML and JS. Using Adobe AIR is a possibility. And I think I can do this with various application wrappers, using .NET for Windows XP, Objective-C for iPhone, Java for Android, and native "widget" platform support for Mac and Windows Vista & 7 (though I'd like to keep the widget in the foreground, so the Mac dashboard isn't ideal). Does anyone have any suggestions on where to start? The two sticking points are:

    - I'll certainly need some form of persistent storage (cookies perhaps) to keep state between sessions.
    - I'll also probably need access to remote data files, so if I use AJAX and the hosting HTML file resides on the device, it will need to be able to do cross-domain requests. I've done this on the iPhone without any problems, but I'd be surprised if this were possible on other platforms.

    For me, Android and iPhone will be the easiest to handle, and it looks like I can use Adobe AIR to handle the rest. But I wanted to know if there are any other alternatives. Does anyone have any suggestions?

    Read the article

  • Reporting system for organization. Architecture advise required

    - by Andrew Florko
    We have several legacy and 3rd-party systems in the organization that use several RDBMS vendors (and more specific data stores). Cross-system data reporting (as well as extra reports that are not implemented in the 3rd-party systems) is required, with charts and population of templates (Word, Excel). The reporting system is envisioned as an intranet web site with custom user access to reports. We expect ~50 reports per day.

    Would you suggest using BizTalk or any other integration software, given that the commercial department doesn't plan to buy anything expensive? Would you suggest creating a centralized data store for reporting that is populated regularly, or relying on on-demand services that provide up-to-date data at request time?

    Thank you in advance!

    Read the article

  • Emulating a transaction-safe SEQUENCE in MySQL

    - by Michael Pliskin
    We're using MySQL with the InnoDB storage engine and transactions a lot, and we've run into a problem: we need a nice way to emulate Oracle's SEQUENCEs in MySQL. The requirements are:

    - concurrency support
    - transaction safety
    - max performance (meaning minimizing locks and deadlocks)

    We don't care if some of the values won't be used, i.e. gaps in the sequence are OK. There is an easy way to achieve this by creating a separate InnoDB table with a counter; however, this means it will take part in the transaction and will introduce locks and waiting. I am thinking of trying a MyISAM table with manual locks. Any other ideas or best practices?
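    A sketch of the common counter-table idiom, assuming a MyISAM table is acceptable so the increment is never rolled back with the caller's transaction (gaps appear on rollback, which matches the requirements above):

        CREATE TABLE order_seq (id BIGINT UNSIGNED NOT NULL) ENGINE=MyISAM;
        INSERT INTO order_seq VALUES (0);

        -- Atomic increment; each connection then reads back its own value.
        UPDATE order_seq SET id = LAST_INSERT_ID(id + 1);
        SELECT LAST_INSERT_ID();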

    Read the article
