Search Results

Search found 2730 results on 110 pages for 'storing'.

Page 69/110 | < Previous Page | 65 66 67 68 69 70 71 72 73 74 75 76  | Next Page >

  • Decimal rounding strategies in enterprise applications

    - by Sapphire
    I'm wondering about rounding decimals and storing them in the DB. The problem is this: let's say we have a customer and an invoice. The invoice has a total price of $100.495 (due to some discount percentage which is not an integer), but it is shown as $100.50 (rounded, just for printing on the invoice). It is stored in the DB with the price of $100.495, which means that when the customer makes a deposit of $100.50 there will be an extra $0.005 on the account. If this is rounded it will appear as $0, but after a couple of invoices it keeps accumulating, which looks wrong (although it actually is not). What is best to do in this case: store the value of $100.50, or leave everything as is?
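
    One common strategy, sketched here in Python with the standard decimal module (the figures come from the question; the workflow around them is an assumption): round once, at the moment the invoice is issued, and store that rounded amount, so payments reconcile exactly and no residue accumulates.

        from decimal import Decimal, ROUND_HALF_UP

        raw_total = Decimal("100.495")              # total computed from the discount
        invoice_total = raw_total.quantize(         # round to cents once, when invoicing
            Decimal("0.01"), rounding=ROUND_HALF_UP)

        print(invoice_total)                        # 100.50 -- this is the value to store
        deposit = Decimal("100.50")
        print(deposit - invoice_total)              # 0.00  -- nothing left to accumulate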

    Read the article

  • Python having problems writing/reading and testing in a correct format

    - by Ionut
    I'm trying to make a program that will do the following: check whether auth_file exists; if it does, read the file and try to log in using the data from that file (if that data is wrong, request new data); if it does not, request the data, then create the file and fill it with the requested data. So far: import json import getpass import os import requests filename = ".auth_data" auth_file = os.path.realpath(filename) url = 'http://example.com/api' headers = {'content-type': 'application/json'} def load_auth_file(): try: f = open(auth_file, "r") auth_data = f.read() r = requests.get(url, auth=auth_data, headers=headers) if r.reason == 'OK': return auth_data else: print "Incorrect login..." req_auth() except IOError: f = file(auth_file, "w") f.write(req_auth()) f.close() def req_auth(): user = str(raw_input('Username: ')) password = getpass.getpass('Password: ') auth_data = (user, password) r = requests.get(url, auth=auth_data, headers=headers) if r.reason == 'OK': return user, password elif r.reason == "FORBIDDEN": print "Incorrect login information..." req_auth() return False I have the following problem (understanding and applying the correct way): I can't find a correct way of storing the data returned from req_auth() to auth_file in a format that can be read and used in load_auth_file(). PS: Of course I'm a beginner in Python and I'm sure I have missed some key elements here :(
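
    A minimal sketch of one way to persist what req_auth() returns (the helper names and JSON layout are illustrative, not from the original post): write the username and password as a small JSON document and parse it back when loading, so both functions agree on the format.

        import json

        def save_auth(path, user, password):
            # Stores the credentials in plain text, as the original script intends.
            with open(path, "w") as f:
                json.dump({"user": user, "password": password}, f)

        def load_auth(path):
            # Returns a (user, password) tuple, usable directly as requests' auth= argument.
            with open(path, "r") as f:
                data = json.load(f)
            return data["user"], data["password"]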

    Read the article

  • Interview question - C#

    - by ltech
    I was tasked with conducting my first interview and would like to pose my question here, both for feedback on the question itself and for your solutions. Question: I have a legacy system with users and files; the info for all files pertaining to a user is stored in a flat file. I want to upgrade this system by storing all the info in a DB: design the tables, and create a C# system that will populate the new DB as well as FTP the files to a new path. Define the design considerations and develop a prototype. Note: we are looking more for what design one would use and why, rather than code that compiles. If it does, then kudos to you and we will give it more weight. @Tim C, I did show the interviewee the file: User1234.txt UserID=1234 ParentPath=\\somewhere\nowehere\everywhere\1234 FileCount=20 File0=something0.ext .. File19=something19.ext @Tim C, I have never conducted an interview and I followed a script given to me by my senior developer, who was absent.
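
    For reference, a small sketch of reading the legacy file format shown above into Python structures (the helper is hypothetical and only covers the keys visible in the sample):

        def parse_user_file(path):
            # Each line is KEY=VALUE, e.g. UserID=1234 or File0=something0.ext
            record = {}
            with open(path) as f:
                for line in f:
                    line = line.strip()
                    if line and "=" in line:
                        key, value = line.split("=", 1)
                        record[key] = value
            count = int(record.get("FileCount", 0))
            files = [record["File%d" % i] for i in range(count)]
            return record["UserID"], record["ParentPath"], files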

    Read the article

  • How to handle a large table in MySQL?

    - by Frantz Miccoli
    I have a database used to store items and properties of these items. The number of properties is extensible, so there is a join table that stores each property value associated with an item. CREATE TABLE `item_property` ( `property_id` int(11) NOT NULL, `item_id` int(11) NOT NULL, `value` double NOT NULL, PRIMARY KEY (`property_id`,`item_id`), KEY `item_id` (`item_id`) ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci; This database has two goals: storing (which has first priority and has to be very quick; I would like to perform many inserts, hundreds, in a few seconds), and retrieving data (selects using item_id and property_id), which is a second priority; it can be slower, but not too much, because that would ruin my usage of the DB. Currently this table holds 1.6 billion entries and a simple count can take up to 2 minutes... Inserting isn't fast enough to be usable. I'm using Zend_Db to access my data and would really be happy if you don't suggest that I develop any PHP-side part. Thanks for your advice!
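
    A hedged sketch of the usual first fix for slow inserts at this scale, batching rows and committing once per chunk rather than once per row (illustrated in Python; the %s placeholders assume a MySQL DB-API driver such as MySQLdb or mysql-connector, which is an assumption, not something stated in the question):

        def bulk_insert(conn, rows, chunk_size=1000):
            # rows: iterable of (property_id, item_id, value) tuples
            sql = ("INSERT INTO item_property (property_id, item_id, value) "
                   "VALUES (%s, %s, %s)")
            cur = conn.cursor()
            batch = []
            for row in rows:
                batch.append(row)
                if len(batch) >= chunk_size:
                    cur.executemany(sql, batch)   # one round trip per chunk
                    batch = []
            if batch:
                cur.executemany(sql, batch)
            conn.commit()                         # single commit instead of one per row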

    Read the article

  • Servlet 3 spec and ThreadLocal

    - by mindas
    As far as I know, the Servlet 3 spec introduces an asynchronous processing feature. Among other things, this means that the same thread can and will be reused for processing other, concurrent HTTP requests. This isn't revolutionary, at least for people who worked with NIO before. Anyway, it leads to another important thing: no ThreadLocal variables as temporary storage for request data, because if the same thread suddenly becomes the carrier thread for a different HTTP request, request-local data will be exposed to another request. All of that is my pure speculation based on reading articles; I haven't had time to play with any Servlet 3 implementations (Tomcat 7, GlassFish 3.0.X, etc.). So, the questions: Am I correct to assume that ThreadLocal will cease to be a convenient hack for keeping request data? Has anybody played with any of the Servlet 3 implementations and tried using ThreadLocals to prove the above? Apart from storing data inside the HTTP session, are there any other similarly easy-to-reach hacks you could advise?

    Read the article

  • Access SQL Server without direct permission in a domain from ASP.NET

    - by Yongwei Xing
    Hi all, here is the situation. All the machines and users are in the same domain; we are in a domain environment. There are some SQL Server 2005/2008 instances storing data. There is an ASP.NET site in the domain using Windows Authentication. Now, we need to read the data from SQL Server and display it using SqlDataSource and GridView, but most of the users do not have direct permission to access the database. Is there any solution to get the data from the database and display it on the site without granting the users permission? Best Regards,

    Read the article

  • How can I leverage String constants in an XML file?

    - by jayshao
    I'd like to enforce standardized keys by storing them as static final String variables on a Java class, and either referencing or statically importing them, to use them as values in XML, Strings, Methods, Annotations, etc. Does anyone know a good way to have Maven insert (like filtering) values such as StringKeys.SOME_KEY into an XML file? E.g. something like <element value="${StringKeys.SOME_KEY}"/> or similar - the main idea is to enforce commonality and prevent key misalignment. Or an alternative solution that accomplishes the same, with some semantics such that if a non-existent String is referenced, the build fails? Bonus points if it works in C# as well.

    Read the article

  • TFS Automated Builds to Code Packages

    - by Adam Jenkin
    I would like to hear the best practices, or how people perform the following task, in TFS 2008. I intend to use TFS for building and storing web application projects. Sometimes these projects can contain hundreds of files (*.cs, *.ascx, etc.). During the lifetime of the website, a small bug will get raised resulting in, say, a stylesheet change and a change to default.aspx.cs. On checking in these changes to TFS, an automated build would be triggered (great!); however, for deploying the changes to the target production machine, I only need to deploy, for example: style.css default.aspx MyWebApplications.dll So my question is: can MSBuild be customized to generate a "code pack" of only the files which require deploying to the production server, based on the changeset which caused the rebuild?

    Read the article

  • Best way to store chat messages and files

    - by Stnaire
    I would like to know what you think about storing chat messages in a database. I need to be able to bind other stuff to them (like files, or contacts), and using a database is the best way I can see for now. The same question goes for files: because they can be bound to chat messages, I have to store them in the database too. With thousands of messages and files I wonder about performance drops and database size. What do you think, considering I'm using PHP with MySQL/Doctrine?

    Read the article

  • Does it make sense to send password information in email communication from websites?

    - by Samuel
    Most online sites send a link to activate your account on registration, and in any further correspondence with the end user they provide information about the site and also the login credentials, with the password in clear text (as given below): Username - [email protected] Password - mysecretpassword What would you do in such a case? From a usability perspective does it make sense to send the password information in clear text, or should you just avoid sending this information? I was under the impression that most passwords are MD5-hashed before being stored in the database, and hence the service provider would not have any access to clear-text passwords. Is this a security violation?
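
    For context, a small sketch of the salted-hash storage the question alludes to, in Python (illustrative only; real systems should prefer a slow KDF such as bcrypt or PBKDF2 over a bare SHA-256): a provider that stores passwords this way cannot mail them back in clear text.

        import hashlib
        import os

        def hash_password(password, salt=None):
            # Store salt and digest; the clear-text password is never kept.
            salt = salt or os.urandom(16)
            digest = hashlib.sha256(salt + password.encode("utf-8")).hexdigest()
            return salt, digest

        def verify_password(password, salt, stored_digest):
            return hash_password(password, salt)[1] == stored_digest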

    Read the article

  • How to effectively clip the number of entries in a dictionary?

    - by reinier
    I had a List<myClass> myList for storing a list of items. When I had to clip this (discard any items above some threshold) I used: myList.RemoveRange(threshold, myList.Count - threshold); where threshold is the maximum number of things the list can contain. Now I've upgraded the data type to a Dictionary<key, myClass> myDictionary. How can I do basically the same thing: discard all entries above some threshold? (It doesn't matter which ones are discarded.) I guess I could foreach through the keys collection and manually delete all the key/value pairs, but I was hoping there was a more elegant solution.
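
    The question is about C#, but the idea is quick to show in a Python sketch (names are illustrative): keep an arbitrary threshold of entries and rebuild the map, since it doesn't matter which entries survive.

        from itertools import islice

        def clip(mapping, threshold):
            # Takes the first 'threshold' items in iteration order -- which ones
            # are kept is unspecified, matching the question's requirement.
            return dict(islice(mapping.items(), threshold))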

    Read the article

  • WordPress MU image speed problem

    - by InnateDev
    I have an MU install with the typical blogs.dir folder storing files for each blog. When loading these images, however, they take forever to appear, although they eventually do. It seems that WPMU uses PHP to serve each image, which is ludicrous. When using images from the same domain but in a root folder, the images are displayed quickly. Is there a workaround for blogs.php rendering the files? Could there be something else wrong in the settings of my install?

    Read the article

  • Dealing with a fat web service in .NET 3.5 C#

    - by Chris M
    I'm dealing with an obese 3rd-party web service that returns about 3 MB of data for a simple search result, and about 50% of the data in that response is junk. Would it make sense, then, to remap this data to my own result object and ditch the response, so I'm storing 1-2 MB in memory for filtering and sorting rather than using the web response's own object and using 2-4 MB, or am I missing a point? So far I've been accessing the web service from a separate project and using a new class to provide the interaction and handle the persistence, so my project looks like this: |- Web (mvc2 proj) |- DAL (database/storage fluent-nhibernate) |- SVCGateway (interaction layer + webservice related models) |- Services -------------- |- Tests |- Specs I'm trying to make the application behave fast, and I also need to store the result set temporarily in case a customer goes to view a product and wants to go back to the results. (The service returns only 500 of a possible 14K results.) So basically I'm looking for confirmation that I'm doing the right thing in pushing the results into my own objects, or whether I'm breaking some rule, or even if there's a better way of handling it. Thanks

    Read the article

  • Unread email notifier, most practical approach

    - by Michael Pasqualone
    I'm in the process of writing a small php-cli script that will loop over my personal inbox and then send me an SMS via a gateway. The question I have is: as the script will launch via cron every 10 minutes, if there is an email sitting in my inbox that is not read before the next script launch, then I will receive 2 SMS messages. Does anyone have any idea (pseudocode will do) what the best practice would be in PHP 5 to ensure only 1 SMS is sent? What I am currently leaning towards is storing the message ID in an SQLite DB and flagging a field indicating whether an SMS has been sent or not - but I'm wondering if there is an easier way?
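
    One workable pattern, sketched in Python rather than PHP purely for illustration: record each message ID in a small SQLite table and skip IDs that are already present, so a given message only ever triggers one SMS across cron runs.

        import sqlite3

        def should_notify(db_path, message_id):
            conn = sqlite3.connect(db_path)
            conn.execute("CREATE TABLE IF NOT EXISTS notified "
                         "(message_id TEXT PRIMARY KEY)")
            try:
                conn.execute("INSERT INTO notified (message_id) VALUES (?)",
                             (message_id,))
                conn.commit()
                return True       # first time this message is seen: send the SMS
            except sqlite3.IntegrityError:
                return False      # already notified on a previous run
            finally:
                conn.close()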

    Read the article

  • Optimal directory structure for filesystem

    - by Pankaj
    We have a large-scale web application which has millions of customers. Each customer can have documents, organized by document type; we may have 20-30 types of documents. We are planning to use GlusterFS for storing these documents. I'm trying to find out what the limitations of Gluster are as far as the number of files/directories goes. Do we need to have a hierarchical directory structure? What would be the optimal directory structure? Does this make sense: CustomerId > DocumentType > File1, File2
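
    A rough sketch of the kind of layout being asked about, with an extra hashed level so no single directory grows too large (the two-character fan-out is an assumption for illustration, not a GlusterFS requirement):

        import hashlib
        import os

        def document_path(root, customer_id, doc_type, filename):
            # e.g. <root>/a3/1234/invoice/file.pdf -- the 2-char shard keeps
            # directory listings small even with millions of customers
            shard = hashlib.md5(str(customer_id).encode()).hexdigest()[:2]
            return os.path.join(root, shard, str(customer_id), doc_type, filename)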

    Read the article

  • Is it really wrong to version documents using CouchDB's behaviour?

    - by Tomas Sedovic
    This is one of those "I know I shouldn't do this, but it's oh so convenient" questions. Sorry about that. I plan to use CouchDB for storing a bunch of documents and keeping their entire revision history. CouchDB does the versioning automatically, but relying on it is strongly discouraged: "You cannot rely on document revisions for any other purpose than concurrency control." From what I've found on the CouchDB wiki, old revisions can get deleted either during compaction or during replication. As far as I can tell, compaction must always be triggered manually, and replication occurs only when there's more than one database server. The question is: if I won't run compaction and will use only a single database instance for my documents, can I just use CouchDB's document versioning and expect it to work? What other problems might I run into? E.g. does not running compaction hurt performance or consume significantly more disk space (than if I handled the versioning manually)?

    Read the article

  • Regular expression match, extracting only wanted segments of string

    - by Ben Carey
    I am trying to extract three segments from a string. As I am not particularly good with regular expressions, I think what I have done could probably be done better... I would like to extract the bold parts of the following string: SOMETEXT: ANYTHING_HERE (Old=ANYTHING_HERE, New=ANYTHING_HERE) Some examples could be: ABC: Some_Field (Old=,New=123) ABC: Some_Field (Old=ABCde,New=1234) ABC: Some_Field (Old=Hello World,New=Bye Bye World) So the above would return the following matches: $matches[0] = 'Some_Field'; $matches[1] = ''; $matches[2] = '123'; So far I have the following code: preg_match_all('/^([a-z]*\:(\s?)+)(.+)(\s?)+\(old=(.+)\,(\s?)+new=(.+)\)/i',$string,$matches); The issue with the above is that it returns a match for each separate segment of the string, and I do not know how to ensure the string is in the correct format using a regular expression without catching and storing the match, if that makes sense. So my question, if it is not already clear: how can I retrieve just the segments that I want from the above string?
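
    One way to capture only the three wanted segments, shown with Python's re purely for brevity (the same pattern should drop into preg_match in PHP): non-greedy groups keep the Old and New values separate even when they are empty or contain spaces.

        import re

        pattern = re.compile(r'^\w+:\s*(\S+)\s*\(Old=(.*?),\s*New=(.*?)\)$')

        for line in ["ABC: Some_Field (Old=,New=123)",
                     "ABC: Some_Field (Old=Hello World,New=Bye Bye World)"]:
            m = pattern.match(line)
            if m:
                print(m.groups())   # ('Some_Field', '', '123'), then
                                    # ('Some_Field', 'Hello World', 'Bye Bye World')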

    Read the article

  • Python and database

    - by axl456
    Hello. I am working on a personal project where I need to manipulate values in a database-like format. Up until now I have been using dictionaries, tuples, and lists to store and consult those values. I am thinking about starting to use SQL to manipulate those values, but I don't know if it's worth the effort, because I don't know anything about SQL and I don't want to use something that won't bring me any benefits (if I can do it in a simpler way, I don't want to complicate things). If I am only storing and consulting values, what would be the benefit of using SQL? PS: the number of rows goes between 3 and 100 and the number of columns is around 10 (some may have 5, some may have 10, etc.)
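
    At 3-100 rows, plain dictionaries are usually fine; the main thing SQL adds is declarative querying. If curiosity wins out, Python's built-in sqlite3 makes trying it nearly free - a tiny sketch with made-up table and column names:

        import sqlite3

        conn = sqlite3.connect(":memory:")      # throwaway in-memory database
        conn.execute("CREATE TABLE items (name TEXT, price REAL, qty INTEGER)")
        conn.executemany("INSERT INTO items VALUES (?, ?, ?)",
                         [("apple", 0.5, 10), ("pear", 0.7, 3)])
        for row in conn.execute("SELECT name FROM items WHERE qty > 5"):
            print(row)                           # ('apple',)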

    Read the article

  • Amazon S3 as secure backup without multiple invoices

    - by Tom Viner
    I'm storing copies of database backups on Amazon S3 using the Python Boto library, but I worry that if my web server were hacked, those backups could be deleted using the credentials I need to do the upload. OK, so I know you can grant permissions to another Amazon email address, so I can imagine doing that after an upload and then removing the original user's write access, BUT in this scenario I now end up with two accounts and two sets of invoices to give to accounts every month. Is there a solution to this that doesn't require a new Amazon account for each web server I run?

    Read the article

  • What data structure would be the least painful DataTable replacement?

    - by MatthewMartin
    I'm storing a lot of sorted, ~10-row, 2-column key/value pairs in the ASP.NET cache - they're the data for dropdown lists. Right now they are all DataTables, which isn't very space efficient (the rule of thumb is a 10x increase in size when data is stored in a DataSet). Old code: DataTable table = dataAccess.GetDataTable(); dropDownList.DataSource = table; Hoped-for new code: Unknown data = dataAccess.GetSomethingMoreSpaceEfficient(); dropDownList.DataSource = data; What pre-existing data structures are similar enough to DataTable to minimize code breakage while reducing the serialized size when stored in the ASP.NET cache?

    Read the article

  • UTF-8 conversion

    - by leachianus
    Hey guys, I am grabbing a JSON array and storing it in an NSArray; however, it includes JSON-encoded UTF-8 strings, for example pass\u00e9 represents passé. I need a way of converting all of these different types of strings into the actual characters. I have an entire NSArray to convert, or I can convert it when it is being displayed, whichever is easiest. I found this chart http://tntluoma.com/sidebars/codes/ - is there a convenience method for this, or a library I can download? Thanks. BTW, there is no way I can find to change the server, so I can only fix it on my end...
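
    For what it's worth, \u00e9 is standard JSON string escaping, and a real JSON parser decodes it automatically; here is a tiny Python illustration of that point (on iOS the equivalent is letting the JSON library produce the NSStrings rather than treating the escapes as literal text):

        import json

        raw = '{"word": "pass\\u00e9"}'   # the escaped form as sent by the server
        data = json.loads(raw)
        print(data["word"])               # passé -- decoded by the parser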

    Read the article

  • Best place to store large amounts of session data

    - by audiopleb
    I'm building an application that needs to store and re-use large amounts of data per session. For example, the user selects a large list of list items (say 2000 or significantly more) which have a numeric value as their key, saves that selection, goes off to another page, does something else, then comes back to the original page and needs to load their selections into that page. What is the quickest and most efficient way of storing and reusing that data? In a text file saved with the session ID? In a temp DB table? In the session data itself (DB sessions, so size isn't a limit), using a serialized string, or using gzcompress or gzencode? Any advice or insight would be great! Thank you!
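
    A sketch of the "serialize and compress" option, in Python purely for illustration (the PHP analogue would be json_encode plus gzcompress): a couple of thousand numeric keys shrink to a small blob that can live in the session row.

        import json
        import zlib

        selection = list(range(2000))      # the user's selected item keys

        blob = zlib.compress(json.dumps(selection).encode())   # store this in the session
        restored = json.loads(zlib.decompress(blob).decode())

        print(len(blob), restored[:5])     # compressed size in bytes, [0, 1, 2, 3, 4]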

    Read the article

  • Determining the number of objects in an NSArray

    - by Viral
    Hi friends, I am making a book application. To move to the next topic I am using a button. The button works, as it moves to the next topic, but at the end of the file my application gets the message obj_fatal and crashes. If I knew how many objects there are in my NSArray then the problem would be solved. I am getting the details from a .plist file and storing them in an array. So if anyone knows how, please let me know. Thanks in advance. Viral.

    Read the article

  • Data structure for counting frequencies in a database table-like format

    - by user373312
    I was wondering if there is a data structure optimized for counting frequencies in data that is stored in a database table-like format. For example, the data comes in a (comma-)delimited format like the one below. col1, col2, col3 x, a, green x, b, blue ... y, c, green Now I simply want to count the frequency of col1=x, or of col1=x and col3=green. I have been storing the data in a database table, but from my profiling and empirical observation, the database connection is the bottleneck. I have tried using in-memory database solutions too, and that works quite well; the only problems are the memory requirements and the quirky init/destroy calls. Also, I work mainly with Java, but have experience with .NET, and was wondering if there is any API to work with "tabular" data in a LINQ-like way in Java. Any help is appreciated.
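
    If the data fits in memory, the counting itself is just dictionary lookups; a Python sketch using collections.Counter (the question asks about Java, where a HashMap plays the same role):

        import csv
        from collections import Counter
        from io import StringIO

        data = StringIO("col1,col2,col3\nx,a,green\nx,b,blue\ny,c,green\n")

        col1_counts = Counter()
        pair_counts = Counter()
        for row in csv.DictReader(data):
            col1_counts[row["col1"]] += 1
            pair_counts[(row["col1"], row["col3"])] += 1

        print(col1_counts["x"])              # 2
        print(pair_counts[("x", "green")])   # 1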

    Read the article

  • How do I hook all HTTP GET requests and distinguish files downloaded by the standard download manager?

    - by Ivan
    I want to write a Firefox add-on for advanced history tracking and bookmarking which will send the URLs the browser encounters during usage (and all the metadata available about the context) to a web service, which will keep track of them, storing them in an SQL database for further access and analysis. I'd like to divide the tracked URLs into 5 groups: those I explicitly bookmark, those I download with Firefox's standard built-in download manager, all other URLs accessed, all URLs appearing as hrefs on all viewed pages, and all other URLs mentioned in the HTML sources of all viewed pages. Any ideas on how to get those in an extension?

    Read the article

< Previous Page | 65 66 67 68 69 70 71 72 73 74 75 76  | Next Page >