Search Results

Search found 71826 results on 2874 pages for 'master data services'.

Page 105/2874 | < Previous Page | 101 102 103 104 105 106 107 108 109 110 111 112  | Next Page >

  • MQ EOL Data conversion

    - by lemotdit
    We are sending data through MQ from a z/OS/CICS system to an AS/400. The original encoding of the message is CCSID 500 with an MQSTR format. The client application gets the message with the CONVERT option and CCSID 819. The data is converted correctly except for the end-of-line character. Any idea? The z/OS side is sending 0D (CR) as the end-of-line character. If it sends 0D+0A (CR+LF), the CCSID automatically changes from 500 to 437, and the end of line is still not right on the client side.
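
    If the receiving application simply expects LF line endings, one workaround is to normalize the payload after the MQ conversion has run. The snippet below is only a rough sketch in Python, assuming the message body is available as raw bytes; it is not tied to any particular MQ client library.

        def normalize_line_endings(payload: bytes) -> bytes:
            """Collapse CR+LF and bare CR to LF after the CCSID conversion."""
            return payload.replace(b"\r\n", b"\n").replace(b"\r", b"\n")

        # Example: a converted message ending in CR (0x0D)
        print(normalize_line_endings(b"HELLO WORLD\r"))   # b'HELLO WORLD\n'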

    Read the article

  • Efficient data structure for fast random access, search, insertion and deletion

    - by Leonel
    I'm looking for a data structure (or structures) that would allow me to keep an ordered list of integers, with no duplicates, and with indexes and values in the same range. I need four main operations to be efficient, in rough order of importance:

    1. taking the value at a given index
    2. finding the index of a given value
    3. inserting a value at a given index
    4. deleting the value at a given index

    Using an array I get 1 in O(1), but 2 is O(N), and insertions and deletions are expensive (O(N) as well, I believe). A linked list has O(1) insertion and deletion (once you have the node), but 1 and 2 are O(N), thus negating the gains. I tried keeping two arrays, a[index]=value and b[value]=index, which turns 1 and 2 into O(1) but turns 3 and 4 into even more costly operations. Is there a data structure better suited for this?
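
    One family of structures that balances all four operations is an order-statistic (indexed) balanced tree, which gives O(log N) for each. As a self-contained illustration, here is a rough Python sketch of a simpler square-root-decomposition list; the class name and the rebuild policy are mine, and a production version would want a real balanced tree instead.

        import math

        class BlockList:
            """Sketch: values kept in a list of small blocks plus a value -> block
            map, giving roughly O(sqrt(N)) index lookup, value lookup, insert and
            delete for distinct values."""

            def __init__(self, values=()):
                self._build(list(values))

            def _build(self, flat):
                size = max(1, int(math.sqrt(len(flat))))
                self.blocks = [flat[i:i + size] for i in range(0, len(flat), size)] or [[]]
                self.block_of = {v: b for b in self.blocks for v in b}

            def _locate(self, index):
                # Walk the blocks to turn a flat index into (block, offset).
                for block in self.blocks:
                    if index < len(block):
                        return block, index
                    index -= len(block)
                raise IndexError(index)

            def value_at(self, index):                 # operation 1
                block, offset = self._locate(index)
                return block[offset]

            def index_of(self, value):                 # operation 2
                target = self.block_of[value]
                base = 0
                for block in self.blocks:
                    if block is target:
                        return base + block.index(value)
                    base += len(block)

            def insert(self, index, value):            # operation 3
                if index == len(self.block_of):        # appending at the end
                    block, offset = self.blocks[-1], len(self.blocks[-1])
                else:
                    block, offset = self._locate(index)
                block.insert(offset, value)
                self.block_of[value] = block
                if len(block) > 2 * max(1, int(math.sqrt(len(self.block_of)))):
                    self._build([v for b in self.blocks for v in b])   # occasional rebuild

            def delete(self, index):                   # operation 4
                block, offset = self._locate(index)
                del self.block_of[block.pop(offset)]

        bl = BlockList([10, 20, 30])
        bl.insert(1, 15)                          # list is now 10, 15, 20, 30
        print(bl.value_at(2), bl.index_of(30))    # 20 3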

    Read the article

  • How do I do automatic data serialization of data objects in Haskell

    - by Adam Gent
    One of the huge benefits of languages that have some sort of reflection/introspection is that objects can be automatically constructed from a variety of sources. For example, in Java I can use the same objects for persisting to a DB (with Hibernate), serializing to XML (with JAXB), or serializing to JSON (json-lib). You can do the same in Ruby and Python, usually by following some simple rules for properties, or with annotations in Java. Thus I don't need lots of "Domain Transfer Objects"; I can concentrate on the domain I am working in. It seems that in very strict FP languages like Haskell and OCaml this is not possible, particularly Haskell. The only thing I have seen is some sort of preprocessing or meta-programming (OCaml). Is it just accepted that you have to do all the transformations from the bottom up? In other words, do you have to do a lot of boring work to turn a Haskell data type into a JSON/XML/DB row object and back again into a data object?
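
    For reference, this is the kind of reflection-driven serialization the question has in mind, sketched in Python with a hypothetical Person type: the fields are discovered at runtime, so no hand-written transfer objects are needed.

        import json
        from dataclasses import dataclass, asdict

        @dataclass
        class Person:            # hypothetical domain object
            name: str
            age: int

        # asdict() walks the dataclass fields via introspection,
        # so JSON serialization comes "for free".
        print(json.dumps(asdict(Person(name="Ada", age=36))))   # {"name": "Ada", "age": 36}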

    Read the article

  • What is the best possible technology for pulling huge data from 4 remote servers

    - by Habib Ullah Bahar
    Hello, for one of our projects we need to pull huge real-time stock data from 4 remote servers across two countries. The trivial process here is to check the sources at a regular interval and save the updates to the database. But as this is real-time stock data for more than 1000 companies, I have to pull every second, which I think isn't good in terms of memory and bandwidth. Please give me suggestions on which technology/platform [we are flexible here; PHP, Python, Java or Perl, any one of them will be OK for us] we should choose so that this can be achieved easily and with good performance.
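
    Whatever platform is chosen, the polling loop itself can be kept simple by fetching the four sources concurrently rather than one after another. A minimal sketch in Python follows; the feed URLs are made up, and parsing/persistence is left as a comment.

        import time
        import urllib.request
        from concurrent.futures import ThreadPoolExecutor

        FEEDS = [                      # hypothetical endpoints on the four servers
            "http://feed1.example.com/quotes",
            "http://feed2.example.com/quotes",
            "http://feed3.example.com/quotes",
            "http://feed4.example.com/quotes",
        ]

        def fetch(url):
            with urllib.request.urlopen(url, timeout=5) as resp:
                return resp.read()

        with ThreadPoolExecutor(max_workers=len(FEEDS)) as pool:
            while True:                               # polls forever, once per second
                payloads = list(pool.map(fetch, FEEDS))
                # ... parse payloads and write only the changed rows to the database ...
                time.sleep(1)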

    Read the article

  • Best data-structure to use for two ended sorted list

    - by fmark
    I need a collection data structure that can do the following:

    - be sorted
    - allow me to quickly pop values off the front and back of the list
    - remain sorted after I insert a new value
    - allow a user-specified comparison function, as I will be storing tuples and want to sort on a particular value
    - thread-safety is not required
    - optionally allow efficient haskey() lookups (I'm happy to maintain a separate hash table for this, though)

    My thoughts at this stage are that I need a priority queue and a hash table, although I don't know if I can quickly pop values off both ends of a priority queue. I'm interested in performance for a moderate number of items (I would estimate fewer than 200,000). Another possibility is simply maintaining an OrderedDictionary and doing an insertion sort every time I add more data to it. Are there any particular implementations in Python? I would really like to avoid writing this code myself.
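
    One option in Python is the third-party sortedcontainers package, whose SortedList keeps items sorted on insert, accepts a key function, and allows popping from either end; haskey()-style lookups can be handled by a separate set or dict kept alongside it. A small sketch with invented tuples:

        from sortedcontainers import SortedList   # third-party: pip install sortedcontainers

        # Sort (price, symbol, volume) tuples on the first element.
        book = SortedList(key=lambda item: item[0])

        book.add((101.5, "ABC", 300))
        book.add((99.0, "XYZ", 150))
        book.add((105.2, "DEF", 75))

        lowest = book.pop(0)     # (99.0, 'XYZ', 150)  -> pop from the front
        highest = book.pop(-1)   # (105.2, 'DEF', 75)  -> pop from the back
        print(lowest, highest, list(book))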

    Read the article

  • Core Data and Relationships

    - by alku83
    I have two objects, a Trip and a Place. A Trip represents a journey from one Place to another Place, i.e. a Trip needs a fromPlace and a toPlace. So this is a 1-to-2 relationship, but I need to know which is the "from" and which is the "to". I am not sure how to model this in Core Data. I have created two entities (Trip, Place), and now I want to set up the relationship(s) so I have a fromPlace and a toPlace. Do I need to add an extra field on the Place entity called isFrom, or similar? If this were in a database, I would just have an id column on the Place table, and then two columns in the Trip table: fromPlaceId and toPlaceId. How do I achieve something similar in Core Data?

    Read the article

  • How can I compare Core Data models?

    - by Don
    I noticed while doing system testing that a feature of our app had been removed. It looks like at some point an older version of a file was checked into SVN that was missing a property. This specific file was generated from the Core Data model, and sure enough, the latest version of the model in SVN is missing the same attribute. I need to find out if any other attributes are missing, or if anything else in the model changed. However, the elements file in the .xcdatamodel folder appears to be binary, and I can't compare the revisions. Is there a way to find the differences between two Core Data models in SVN? Barring that, what would be the best way to accomplish this task?

    Read the article

  • Storing a bucket of numbers in an efficient data structure

    - by BlitzKrieg
    I have buckets of numbers, e.g. 1 to 4, 5 to 15, 16 to 21, 22 to 34, ... I have roughly 600,000 such buckets. The range of numbers that falls in each bucket varies. I need to store these buckets in a suitable data structure so that looking up the bucket for a number is as fast as possible. So my question is: what is a suitable data structure and sorting mechanism for this type of problem? Thanks in advance
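
    Assuming the buckets do not overlap, one straightforward approach is to sort them by their lower bound and binary-search on that bound for each lookup, which costs O(log N) per query. A small Python sketch using the standard bisect module (the example buckets are the ones from the question):

        import bisect

        # (low, high) pairs, sorted by low and non-overlapping.
        buckets = [(1, 4), (5, 15), (16, 21), (22, 34)]
        starts = [low for low, _ in buckets]

        def find_bucket(x):
            """Return the (low, high) bucket containing x, or None if x falls in a gap."""
            i = bisect.bisect_right(starts, x) - 1
            if i >= 0 and buckets[i][0] <= x <= buckets[i][1]:
                return buckets[i]
            return None

        print(find_bucket(18))   # (16, 21)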

    Read the article

  • Fixing Master Boot Record From Linux After Windows Corrupts It

    - by Chris S
    I bought a Dell Inspiron 1545 laptop, and setup Ubuntu 10.04 to dual boot alongside Windows 7. Regrettably, when activating the wireless, I did something in Windows 7 that caused it to overwrite or corrupt the master boot record, so now when I reboot I get the message "No bootable devices found". Is there an easy way to fix this, perhaps via a Linux LiveCD, or do I have to wipe out everything and reinstall?

    Read the article

  • Data structure for a particular problem?

    - by AGeek
    Hi, which data structure can perform the insertion, deletion and searching operations in O(1) time in the worst case? We may assume the set of elements consists of integers drawn from the finite set 1, 2, ..., n, and that initialization can take O(n) time. I can only think of implementing a hash table. Implementing it with trees will not give O(1) time complexity for any of the operations, or is that possible? Kindly share your views on this, or on any other data structure apart from these. Thanks.
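
    Given that the keys come from a known finite set 1..n and O(n) initialization is allowed, a direct-address table (a plain presence array indexed by the key) gives worst-case O(1) for all three operations. A minimal Python sketch:

        class DirectAddressSet:
            """Worst-case O(1) insert/delete/search for integers in 1..n,
            paid for with an O(n) initialization and O(n) memory."""

            def __init__(self, n):
                self.present = [False] * (n + 1)   # index 0 unused; O(n) init

            def insert(self, x):
                self.present[x] = True

            def delete(self, x):
                self.present[x] = False

            def search(self, x):
                return self.present[x]

        s = DirectAddressSet(100)
        s.insert(42)
        print(s.search(42), s.search(7))   # True False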

    Read the article

  • Queue-like data structure with fast search and insertion

    - by Max
    I need a data structure with the following properties:

    - It contains integer numbers, with no duplicates.
    - After it reaches its maximal size, the first element is removed. So if the capacity is 3, putting sequential numbers into it looks like: {}, {1}, {1, 2}, {1, 2, 3}, {2, 3, 4}, {3, 4, 5}, etc.

    Only two operations are needed: inserting a number into this container (INSERT) and checking whether the number is already in the container (EXISTS). The number of EXISTS operations is expected to be approximately twice the number of INSERT operations. I need these operations to be as fast as possible. What would be the fastest data structure, or combination of data structures, for this scenario?
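
    One common combination is a FIFO queue for the eviction order plus a hash set for O(1) EXISTS checks. A small Python sketch; the class name and the choice to silently ignore duplicate inserts are mine:

        from collections import deque

        class BoundedUniqueQueue:
            def __init__(self, capacity):
                self.capacity = capacity
                self.queue = deque()     # remembers insertion order for eviction
                self.members = set()     # O(1) membership checks

            def insert(self, x):
                if x in self.members:    # no duplicates
                    return
                self.queue.append(x)
                self.members.add(x)
                if len(self.queue) > self.capacity:
                    self.members.discard(self.queue.popleft())   # drop the oldest

            def exists(self, x):
                return x in self.members

        q = BoundedUniqueQueue(3)
        for n in (1, 2, 3, 4, 5):
            q.insert(n)
        print(q.exists(1), q.exists(4))   # False True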

    Read the article

  • Transferring Data Between Server and Client (Mobile)

    - by Byron
    Scenario:

    - Client (mobile): .NET CF 2.0, SQL CE 3.0
    - Server: .NET 2.0, SQL Server 2005, web service

    The client and server database schemas differ. From the server, only certain columns from certain tables need to be synced. From the client, everything will need to be synced once the client has made changes. The client will continually poll a web service to download and upload data. A framework will be developed to package and unpackage the data, used by both client and server. How would you develop the packaging and unpackaging? Use DataSets, or serialise strongly typed objects? All suggestions welcome. Thanks

    Read the article

  • Adding single row one by one in tableview from core data of iPhone

    - by user336685
    I am working on RSS reader code where articles get downloaded and are viewable offline. The problem is that the table view containing the headlines only gets updated after all articles are downloaded. Core Data is used, so every time the NSManagedObjectContext is saved, [self.tableView beginUpdates] is called; the table is updated via the Core Data fetched results controller. I tried saving the NSManagedObjectContext every time an article is saved, but that is not updating the table view. I want a mechanism similar to Instapaper's table view, where articles get saved and the table view gets updated immediately. Please help if you know the solution. Thanks in advance.

    Read the article

  • Core Data and BOOL setup

    - by John Valen
    I am working on an app that uses Core Data as its backend for managing SQLite records. I have everything working with strings and numbers, but have just tried adding BOOL fields and can't seem to get things to work. In the .xcdatamodel, I have added a field to my object called isCurrentlyForSale which is not Optional, not Transient, and not Indexed. The attribute's type is set to Boolean with a default value of NO. When I created the class files from the data model, the boilerplate code added for this property in the .h header was: @property (nonatomic, retain) NSNumber * isCurrentlyForSale; along with @dynamic isCurrentlyForSale; in the .m implementation file. I've always worked with booleans as simple BOOLs. I've read that I could use NSNumber's numberWithBool: and boolValue methods, but this seems like an awful lot of extra code for something so simple. Can the @property in the header be changed to a simple BOOL? If so, is there anything to watch out for? Thanks -John

    Read the article

  • AutoComplete implementation - Interview Question

    - by user181218
    Hi, say you have a DB table with two columns: SearchPhrase (String) | Popularity (Int). You need to initialize a data structure so that you could use it to implement an autocomplete feature (like Google Suggest) comfortably. The requirement: once the data from the DB is processed into the data structure, when you type a letter you get the 10 most popular search phrases from the DB starting with that letter; then, when you type the next one, you get the 10 most popular phrases starting with those two letters, and so on. The question only concerns planning the data structure and pseudocoding Insert, Search, etc. Note: YOU CANNOT USE A TRIE. Any ideas?
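
    Since a trie is off the table, one alternative is a plain hash map from every prefix to its 10 most popular phrases, precomputed while the DB rows are loaded; each keystroke then costs a single dictionary lookup. A rough Python sketch (the capped prefix length and the tiny example data are assumptions):

        import heapq
        from collections import defaultdict

        def build_prefix_index(rows, k=10, max_prefix_len=20):
            """rows: iterable of (search_phrase, popularity).
            Returns {prefix: [top-k phrases by popularity]}."""
            candidates = defaultdict(list)
            for phrase, popularity in rows:
                for i in range(1, min(len(phrase), max_prefix_len) + 1):
                    candidates[phrase[:i]].append((popularity, phrase))
            return {prefix: [phrase for _, phrase in heapq.nlargest(k, scored)]
                    for prefix, scored in candidates.items()}

        index = build_prefix_index([("database", 120), ("data structure", 90), ("daemon", 40)])
        print(index["da"])    # ['database', 'data structure', 'daemon']
        print(index["dat"])   # ['database', 'data structure']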

    Read the article

  • Why would one use REST instead of Web services?

    - by AngryHacker
    I attended an interesting demo on REST today; however, I couldn't think of a single reason (nor was one presented) why REST is in any way better or simpler to use and implement than a web services stack. What are some of the reasons anyone in the "real world" would use REST instead of web services?

    Read the article

  • Returning data from SQL Server reporting web service call

    - by user79339
    Hi, I am generating a report that contains a version number. The version number is stored in the DB and retrieved/incremented as part of the report generation. The only problem is that I am calling SSRS via a web service call, which returns the generated report as a byte array. Is there any way to get the version number out of this generated report? For example, to display a dialog that says "You generated Status Report, Version number 3". I tried passing in an output parameter and setting it inside the stored proc. It is modified when I execute it in SQL Management Studio, but not in the reporting studio. Or at least I can't seem to bind to the modified, post-execution value (using the expression "=Parameters!ReportVersion.Value"). Of course, I could get/increment the version number from the database myself before calling the SSRS web service and pass it along as a parameter to the report, but that might cause concurrency problems. On the whole, it just seems neater for the stored proc to access/generate a version number and return it to the reporting engine, which would generate the report with the version number and return the updated version number to the web service client. Any thoughts/ideas?

    Read the article

  • Best way to collect and store data daily?

    - by mktb
    I have a bunch of statistics: number of users, number of families, ratio of users to families, etc. I'd like to store these daily so I can view the data historically. However, I'm looking for the most effective way to store this data. Should I run a cron job that writes a single row to the database, like

        DATE: today, USERS: 123, FAMILIES: 456, RATIO: 7.89

    or whatever? Or should I write multiple rows, like

        DATE: today, DATATYPE: users, VALUE: 123

    Or is there another option I can use that is more efficient or more effective? Thanks!
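
    As an illustration of the "multiple rows" layout (one row per metric per day), here is a small Python sketch using the standard sqlite3 module; the table name and columns are made up, and the same idea applies to any SQL database:

        import sqlite3
        from datetime import date

        conn = sqlite3.connect("stats.db")
        conn.execute("""CREATE TABLE IF NOT EXISTS daily_stats (
                            stat_date TEXT NOT NULL,
                            stat_name TEXT NOT NULL,
                            value     REAL NOT NULL,
                            PRIMARY KEY (stat_date, stat_name))""")

        def record_daily_stats(stats):
            """Write one (date, name, value) row per metric; rerunning the job
            for the same day simply overwrites that day's rows."""
            today = date.today().isoformat()
            conn.executemany(
                "INSERT OR REPLACE INTO daily_stats VALUES (?, ?, ?)",
                [(today, name, value) for name, value in stats.items()])
            conn.commit()

        record_daily_stats({"users": 123, "families": 456, "ratio": 123 / 456})

    The narrow layout makes it easy to add new metrics later without schema changes, at the cost of pivoting the rows when querying.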

    Read the article

  • iOS 5 - Coredata Sqlite DB losing data after killing app

    - by Brian Boyle
    I'm using Core Data with a SQLite DB to persist data in my app. However, each time I kill my app I lose any data that was saved in the DB. I'm pretty sure it's because the .sqlite file for my DB is just being replaced by a fresh one each time my app starts, but I can't seem to find any code that will just use the existing one that's there. It would be great if anyone could point me towards some code that could handle this for me. Cheers, B

        - (NSPersistentStoreCoordinator *)persistentStoreCoordinator
        {
            if (__persistentStoreCoordinator != nil) {
                return __persistentStoreCoordinator;
            }

            NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                         [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                                         [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
            NSURL *storeURL = [[self applicationDocumentsDirectory] URLByAppendingPathComponent:@"FlickrCoreData.sqlite"];

            NSError *error = nil;
            __persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:[self managedObjectModel]];
            if (![__persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                                                            configuration:nil
                                                                      URL:storeURL
                                                                  options:options
                                                                    error:&error]) {
                NSLog(@"Unresolved error %@, %@", error, [error userInfo]);
                abort();
            }
            return __persistentStoreCoordinator;
        }

    Read the article

  • Data Warehouse: One Database or many?

    - by drrollins
    At my new company, they keep all data associated with the data warehouse, including import, staging, audit, dimension and fact tables, together in the same physical database. I've been a database developer for a number of years now, and this consolidation of function and form seems counter to everything I know. It seems to make security, backup/restore and performance-management issues more manually intensive. Is this something that is done in the industry? Are there substantial reasons for doing or not doing it? The platform is Netezza. The size is in terabytes, hundreds of millions of rows. What I'm looking to get from answers to this question is a solid understanding of how right or wrong this path is. From your experience, what are the issues I should focus on arguing about if this is a path that will cause trouble for us down the road? If it is no big deal, then I'd like to know that as well.

    Read the article

  • JSON DATA formatting in WebAPI

    - by user1736299
        public class CalendarController : ApiController
        {
            Events[] events = new Events[]
            {
                new Events { title = "event1", start = System.DateTime.UtcNow, end = System.DateTime.UtcNow },
                new Events { title = "event2", start = System.DateTime.UtcNow, end = System.DateTime.UtcNow },
                new Events { title = "event3", start = System.DateTime.UtcNow, end = System.DateTime.UtcNow }
            };

            public IEnumerable<Events> GetAllCalendar()
            {
                return events;
            }
        }

    The JSON result for the above is:

        [{ "title": "event1", "start": "2012-12-05T22:52:35.6471712Z", "end": "2012-12-05T22:52:35.6471712Z" },
         { "title": "event2", "start": "2012-12-05T22:52:35.6471712Z", "end": "2012-12-05T22:52:35.6471712Z" },
         { "title": "event3", "start": "2012-12-05T22:52:35.6471712Z", "end": "2012-12-05T22:52:35.6471712Z" }]

    How can I create the same JSON result with single quotes instead of double quotes? And how can I get the dates in the format 'YYYY-MM-DD HH:MM:SS'? Thank you, Smith

    Read the article

  • iPhone app launch times and Core Data migration

    - by sehugg
    I have a Core Data application which I plan to update with a new schema. The lightweight migration seems to work, but it takes time proportional to the amount of data in the database. This occurs in the didFinishLaunchingWithOptions phase of the app. I want to avoid "<app> failed to launch in time" problems, so I assume I cannot keep the migration in the didFinishLaunchingWithOptions method. I assume the best approach would be to perform the migration on a background thread, and that I'd need to defer loading of the main ViewController until the migration completes, so the managedObjectContext isn't used before initialization finishes. Does this make sense, and is there example code (maybe in Apple sample projects) of this sort of initialization?

    Read the article
