Search Results

Search found 61393 results on 2456 pages for 'data integration'.

Page 52/2456 | < Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >

  • EXC_MEMORY_ACCESS when trying to delete from Core Data ($cash solution)

    - by llloydxmas
    I have an application that downloads an XML file, parses it, and creates Core Data objects while doing so. In the parse code I have a method called emptyDataContext that removes all items from Core Data before creating replacement items from the XML data. The method looks like this:

        -(void) emptyDataContext
        {
            NSFetchRequest *allCon = [[NSFetchRequest alloc] init];
            [allCon setEntity:[NSEntityDescription entityForName:@"Condition"
                                          inManagedObjectContext:managedObjectContext]];

            NSError *error = nil;
            NSArray *conditions = [managedObjectContext executeFetchRequest:allCon error:&error];
            DebugLog(@"ERROR: %@", error);
            DebugLog(@"RETRIEVED: %@", conditions);
            [allCon release];

            for (NSManagedObject *condition in conditions) {
                [managedObjectContext deleteObject:condition];
            }

            // Update the data model, effectively removing the objects we removed above.
            //NSError *error;
            if (![managedObjectContext save:&error]) {
                DebugLog(@"%@", [error domain]);
            }
        }

    The first time this runs it deletes all objects and functions as it should, creating new objects from the XML file. I created an 'update' button that starts the exact same process of retrieving the file and then proceeding with the parse and build. All is well until it's time to delete the Core Data objects: the deleteObject call raises EXC_BAD_ACCESS each time, and only on the second time through. Captured errors return null. If I log the conditions array I get a list of NSManagedObjects on the first run; on the second run that log request causes a crash exactly as the deleteObject call does. I have a feeling it is something very simple I'm missing or not doing correctly. The data works great in my table views; it's only when updating that I get the crashes. I have spent days and days on this, trying numerous alternative methods, and what's left of my hair is falling out. I'd be willing to ante up some cash for anyone willing to look at my code and see what I'm doing wrong. I just need to get past this hurdle. Thanks in advance for the help!

    Read the article

  • relating data stored in NoSQL DB to data stored in SQL DB

    - by seanbrant
    What's the best way to use a SQL DB alongside a NoSQL DB? I want to keep my users and other data in Postgres, but have some data that would be better suited to a NoSQL DB like Redis. I see a lot of talk about switching to NoSQL but little talk on integrating it with existing systems. I think it would be foolish to throw the baby out with the bath water and ditch SQL altogether, unless it makes things easier to maintain and develop. I'm wondering what the best approach is for relating data stored in SQL to my data in Redis. I was thinking of something along the lines of this:

        - User object stored in SQL
        - Book object stored in Redis; the key is the SHA1 hash of the value, the value is a JSON string
        - Relations stored in Redis; the key is User.pk:books, the value is a Redis set of SHA1s

    Anyone have experience, tips, or better ways?
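
    A minimal sketch of the layout described above, assuming the redis-py client and an existing SQL users table whose primary keys are passed in; the key names and JSON layout are illustrative rather than prescriptive:

        import hashlib
        import json
        import redis

        r = redis.Redis()  # assumes a local Redis instance

        def store_book(user_pk, book):
            """Store a book dict as JSON keyed by its SHA1 and link it to a SQL user."""
            payload = json.dumps(book, sort_keys=True)
            sha1 = hashlib.sha1(payload.encode("utf-8")).hexdigest()
            r.set(sha1, payload)                        # Book object: key = SHA1, value = JSON string
            r.sadd(f"user:{user_pk}:books", sha1)       # Relation: set of SHA1s per SQL user primary key
            return sha1

        def books_for_user(user_pk):
            """Resolve the relation set back into book dicts."""
            return [json.loads(r.get(sha1)) for sha1 in r.smembers(f"user:{user_pk}:books")]

    The SQL side only ever hands out user primary keys, so the relational schema stays untouched.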

    Read the article

  • Haskell data serialization of some data implementing a common type class

    - by Evan
    Let's start with the following:

        data A = A String deriving Show
        data B = B String deriving Show

        class X a where
            spooge :: a -> Q

        [ Some implementations of X for A and B ]

    Now let's say we have custom implementations of show and read, named show' and read' respectively, which use Show as a serialization mechanism. I want show' and read' to have the types

        show' :: X a => a -> String
        read' :: X a => String -> a

    so I can do things like

        f :: String -> [Q]
        f d = map (\x -> spooge $ read' x) d

    where the data could have been [show' (A "foo"), show' (B "bar")]. In summary, I want to serialize values of various types which share a common typeclass, so I can call their separate implementations on the deserialized values automatically. Now, I realize you could write some Template Haskell which would generate a wrapper type, like

        data XWrap = AWrap A | BWrap B deriving (Show)

    and serialize the wrapped type, which would guarantee that the type info is stored with it and that we'd be able to get ourselves back at least an XWrap... but is there a better way using Haskell ninja-ery?

    EDIT Okay, I need to be more application-specific. This is an API. Users will define their As, Bs and fs as they see fit. I don't ever want them hacking through the rest of the code updating their XWraps, or switches, or anything. The most I'm willing to compromise on is one list somewhere of all the A, B, etc. in some format. Why? Here's the application: A is "download a file from an FTP server", B is "convert from FLAC to MP3". A contains username, password, port, etc.; B contains file path information. A and B are Xs, and Xs shall be called "Tickets." Q is IO (). spooge is runTicket. I want to read the tickets off into their relevant data types and then write generic code that will runTicket on the stuff read' from the stuff on disk. At some point I have to jam type information into the serialized data.

    Read the article

  • Cloud Apps and Single Sign-On (AD integration)

    - by Pablo Alvim
    I've been investigating some cloud vendors and the ability to implement single sign-on with them, especially when it comes to AD (Active Directory) integration. So far I've learned that with Azure this is possible through ADFS and the AppFabric Access Control offering. In AWS, since it is possible to create a VPN and see EC2 instances as a natural extension of a private datacenter, I believe implementing SSO would be rather simple (not sure if I'm right on this one... please correct me if I'm wrong). With App Engine, though, even though there is some documentation on AD synchronization (not full integration) for Google Apps, I'm struggling to find out whether AD integration is possible at all. Is there any strategy for that? Any bit of information on cloud apps and AD integration will be appreciated!

    Read the article

  • Looping through array in PHP to post several multipart form-data

    - by Léon Pelletier
    I'm trying, in an ASP web application, to code a function that would loop through a list of files in a multiple-upload form and send them one by one. Is this something that can be done in ASP? I've read some posts about how to attach several files together, but nothing about looping through the files. I can easily imagine it in C# via HttpWebRequest or with sockets, but in PHP I guess there are already functions designed to handle it?

        // This is false/pseudo-code :)
        for (int index = 0; index < number_of_files; index++)
        {
            postfile(file[index]);
        }

    In each iteration it should send a multipart/form-data POST. postfile(TheFileInfos) should make a POST like this:

        POST /afs.aspx?fn=upload HTTP/1.1
        [Header stuff]
        Content-Type: multipart/form-data; boundary=----------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        [Header stuff]

        ------------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        Content-Disposition: form-data; name="Filename"

        myimage1.png
        ------------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        Content-Disposition: form-data; name="fileid"

        58e21ede4ead43a5201206101806420000007667212251
        ------------Ef1Ef1cH2Ij5GI3ae0gL6KM7GI3GI3
        Content-Disposition: form-data; name="Filedata"; filename="myimage1.png"
        Content-Type: application/octet-stream

        [Octet Stream]

    [Edit] I'll try it:

        <html>
        <head>
        <title>Untitled Document</title>
        <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
        </head>
        <body>
        <form name="form1" enctype="multipart/form-data" method="post" action="processFiles.php">
          <p>
          <?
          // start of dynamic form
          $uploadNeed = $_POST['uploadNeed'];
          for($x=0;$x<$uploadNeed;$x++){
          ?>
            <input name="uploadFile<? echo $x;?>" type="file" id="uploadFile<? echo $x;?>">
          </p>
          <?
          // end of for loop
          }
          ?>
          <p><input name="uploadNeed" type="hidden" value="<? echo $uploadNeed;?>">
             <input type="submit" name="Submit" value="Submit">
          </p>
        </form>
        </body>
        </html>
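
    For what it's worth, the loop-and-POST pattern the pseudo-code describes looks roughly like this in Python with the requests library (shown purely for illustration; the endpoint, field names, and file list mirror the example above and are assumptions):

        import requests

        UPLOAD_URL = "http://example.com/afs.aspx?fn=upload"   # placeholder endpoint
        files_to_send = ["myimage1.png", "myimage2.png"]        # hypothetical local files

        for index, path in enumerate(files_to_send):
            with open(path, "rb") as fh:
                response = requests.post(
                    UPLOAD_URL,
                    data={"Filename": path, "fileid": str(index)},                    # ordinary form fields
                    files={"Filedata": (path, fh, "application/octet-stream")},       # the file part
                )
            response.raise_for_status()   # one multipart/form-data POST per file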

    Read the article

  • Storing data locally and synchronizing it with a database on a Linux server

    - by Miraaj
    Hi all, I have developed a Mac application which continuously interacts with a database on a Linux server. As the data grows, fetching it from the server has become a costly affair in terms of time. So I am planning to store the required data locally, say in SQLite, and find some mechanism through which I can synchronize it with the database on the Linux server. Can anyone suggest a way to accomplish this? Thanks, Miraaj
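
    One common approach is to pull only the rows changed since the last sync, keyed on a server-side last-modified column. A minimal sketch, assuming such a column exists on the server table, using sqlite3 for the local cache and a hypothetical fetch_changed_rows() helper standing in for the call to the Linux server:

        import sqlite3

        def fetch_changed_rows(since):
            """Placeholder for the server call: query the Linux server's database for rows
            with last_modified > since and return (id, payload, last_modified) tuples."""
            return []   # replace with a real query against the remote database

        def sync(local_path="cache.db"):
            local = sqlite3.connect(local_path)
            local.execute(
                "CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, payload TEXT, last_modified TEXT)"
            )
            # High-water mark: the newest timestamp already cached locally.
            (since,) = local.execute("SELECT MAX(last_modified) FROM items").fetchone()
            for item_id, payload, modified in fetch_changed_rows(since or "1970-01-01 00:00:00"):
                local.execute(
                    "INSERT OR REPLACE INTO items (id, payload, last_modified) VALUES (?, ?, ?)",
                    (item_id, payload, modified),
                )
            local.commit()
            local.close()

        if __name__ == "__main__":
            sync()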

    Read the article

  • Saving data in SQLite and retrieving it from there

    - by Mona
    In Android, is it possible to insert data into an SQLite database and get it back into our EditText boxes? I want to get data from the database and populate it in my application's activity. How can I do that? I want to save, update and delete records in SQLite, and most importantly I want to get data back out of the SQLite tables. How can this be done? Kindly guide me; I will be thankful to you.
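
    The SQL side of that workflow is just the usual insert/update/delete/select statements. Here is a minimal sketch using Python's sqlite3 module purely to show the statements; on Android the same SQL runs through SQLiteOpenHelper/SQLiteDatabase, and the table and column names here are made up:

        import sqlite3

        conn = sqlite3.connect("app.db")
        conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")

        # Save
        conn.execute("INSERT INTO notes (title, body) VALUES (?, ?)", ("first", "hello"))
        # Update
        conn.execute("UPDATE notes SET body = ? WHERE title = ?", ("edited", "first"))
        # Retrieve - these values are what would populate the EditText boxes
        for note_id, title, body in conn.execute("SELECT id, title, body FROM notes"):
            print(note_id, title, body)
        # Delete
        conn.execute("DELETE FROM notes WHERE title = ?", ("first",))
        conn.commit()
        conn.close()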

    Read the article

  • Ways to store data in .NET

    - by Audel
    I am looking for ways to store data in a Windows Forms application in .NET. I want to make the input data of a system persistent, so that when I close my program and open it again the data is retrieved. What ways are there of doing this besides creating a linked database? Examples are gladly appreciated. Regards

    Read the article

  • Transforming OLTP Relational Database to Data Warehousing Model

    - by Russ Cam
    What are the common design approaches taken in loading data from a typical entity-relationship OLTP database model into a Kimball star schema data warehouse/mart model? Do you use a staging area to perform the transformation and then load into the warehouse? How do you link data between the warehouse and the OLTP database? Where and how do you manage the transformation process - in the database as stored procedures, in DTS/SSIS packages, or as SQL from application code?
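
    As one illustration of the staging-area pattern, a dimension can be refreshed by landing the OLTP rows in a staging table and then upserting them into the dimension on the natural key. A rough sketch using Python's sqlite3 for both sides; in practice the source and warehouse would be separate servers, and the table names here are invented:

        import sqlite3

        oltp = sqlite3.connect("oltp.db")   # source entity-relationship model
        dw = sqlite3.connect("dw.db")       # Kimball-style warehouse

        # Demo source table so the sketch runs standalone; in reality this is the OLTP server.
        oltp.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

        dw.executescript("""
            CREATE TABLE IF NOT EXISTS stg_customer (customer_id INTEGER, name TEXT, city TEXT);
            CREATE TABLE IF NOT EXISTS dim_customer (
                customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
                customer_id  INTEGER UNIQUE,     -- natural key carried over from OLTP
                name TEXT, city TEXT);
            DELETE FROM stg_customer;            -- staging is transient
        """)

        # Extract from OLTP into staging; any cleanup or conforming happens here.
        rows = oltp.execute("SELECT id, name, city FROM customers").fetchall()
        dw.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)", rows)

        # Load: upsert staging rows into the dimension on the natural key (needs SQLite 3.24+).
        dw.execute("""
            INSERT INTO dim_customer (customer_id, name, city)
            SELECT customer_id, name, city FROM stg_customer WHERE true
            ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name, city = excluded.city
        """)
        dw.commit()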

    Read the article

  • Looking for a list of free data APIs and web services

    - by darren
    I'm wondering if anybody has come across a comprehensive list of free sources of data (as a web API) or web services. I'm looking to start a new project to tinker with in my spare time and I am wondering what interesting data is available to play with. It seems like many API services, such as last.fm or Google search, either no longer exist or are no longer free. Possible examples of what I am looking for:

        - information about a given IP address
        - mapping APIs
        - information about books, movies, music
        - information about places, businesses, attractions
        - meteorological, financial or other scientific data
        - shopping, products

    I would appreciate any suggestions you may have about interesting data freely available through the web. Thanks
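
    Whatever source you end up with, most free data APIs boil down to the same fetch-and-parse pattern. A minimal sketch with the requests library, using a placeholder endpoint and key since the actual service is whatever you pick:

        import requests

        # Placeholder endpoint and parameters - substitute whichever free API you settle on.
        API_URL = "https://api.example.com/v1/lookup"
        params = {"q": "8.8.8.8", "apikey": "YOUR_KEY_IF_REQUIRED"}

        response = requests.get(API_URL, params=params, timeout=10)
        response.raise_for_status()   # fail loudly on HTTP errors
        print(response.json())        # most such APIs return JSON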

    Read the article

  • Data Usage Checker Tools

    - by Lucifer
    Hey All, I am about to begin a project for a new client, and am worried about a few things concerning data usage on their internet plan. We're in an area that most of the major networks don't cover, and the ones that do have very expensive plans with very low data allowances per month. I need to develop an app, but part of the problem lies with checking database values every 30 seconds. It's pretty important that this check happens every 30 seconds, as the database is updated all day, every day, approximately every 5 seconds (apparently). Each row in the database consists of about a page full of text if you were to paste it into MS Word. So, are there any logical ways of minimizing data usage in my case? Also, how am I able to see exactly how much data is used just to establish a connection to the database? Are there any tools for this kind of info? Thanks :)
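
    One way to keep the 30-second check cheap is to poll only a tiny change marker (a row count, max id, or last-modified timestamp) and pull the heavy rows only when that marker moves. A rough sketch of the idea, using sqlite3 so it runs standalone; the real app would point the same two queries at the client's database, and the table name is invented:

        import sqlite3
        import time

        conn = sqlite3.connect("mirror.db")   # stand-in for the client's database connection
        conn.execute("CREATE TABLE IF NOT EXISTS articles (id INTEGER PRIMARY KEY, body TEXT)")
        last_seen_id = 0

        while True:
            # Cheap probe: a single scalar instead of page-sized rows.
            (max_id,) = conn.execute("SELECT COALESCE(MAX(id), 0) FROM articles").fetchone()
            if max_id > last_seen_id:
                # Only now pay for the heavy text column, and only for rows not yet seen.
                new_rows = conn.execute(
                    "SELECT id, body FROM articles WHERE id > ?", (last_seen_id,)
                ).fetchall()
                last_seen_id = max_id
                print("fetched", len(new_rows), "new rows")
            time.sleep(30)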

    Read the article

  • How to store data using core data in iphone?

    - by Warrior
    I am new to iPhone development. I want to show a form and store its contents in a Core Data database after the user clicks the submit button. I have created form.xcdatamodel and the classes events.h and events.m with reference to the Apple docs. In some sample code the values are stored statically in the delegate class, and they use Core Data delegate methods. But in my case the form view comes after passing through two other views. I want to store the data entered there. How can I achieve this? Please help me out. Thanks.

    Read the article

  • data ownership and performance

    - by Ami
    We're designing a new application and we ran into some architectural questions when thinking about data ownership. We broke the system down into components, for example Customer and Order. Each of these components/modules is responsible for a specific business domain, i.e. Customer deals with CRUD of customers and business processes centered around customers (registering a new customer, blocking a customer account, etc.). Each module is the owner of a set of database tables, and only that module may access them. If one module needs data that is owned by another module, it retrieves it by requesting it from that module.

    So far so good. The question is how to deal with scenarios such as a report that needs to show all the customers and, for each customer, all his orders. In such a case we need to get all the customers from the Customer module, iterate over them and, for each one, get all the data from the Order module. Performance won't be good... Obviously it would be much better to have a stored proc join the customers table and the orders table, but that would also mean direct access to data owned by another module, creating coupling and dependencies that we wish to avoid.

    This is a simplified example; we're dealing with an enterprise application with a lot of business entities and relationships, and my goal is to keep it clean and as loosely coupled as possible. I foresee many changes to the data scheme in the future, and possibly splitting the system into several completely separate systems. I wish to have a design that would allow this to be done in a relatively easy way. Thanks!
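
    A middle ground that keeps the module boundary but avoids the per-customer round trips is to let the Order module answer one bulk query for a whole set of customer ids, so the report makes two calls instead of N+1. A minimal sketch of that idea; the module interfaces and names are invented for illustration:

        from collections import defaultdict

        class CustomerModule:
            def __init__(self, customers):
                self._customers = customers          # owns the customers data

            def all_customers(self):
                return list(self._customers)

        class OrderModule:
            def __init__(self, orders):
                self._orders = orders                # owns the orders data

            def orders_for_customers(self, customer_ids):
                """One bulk request instead of one request per customer."""
                wanted = set(customer_ids)
                grouped = defaultdict(list)
                for order in self._orders:
                    if order["customer_id"] in wanted:
                        grouped[order["customer_id"]].append(order)
                return grouped

        def customer_order_report(customer_module, order_module):
            customers = customer_module.all_customers()                               # call 1
            orders = order_module.orders_for_customers(c["id"] for c in customers)    # call 2
            return [(c, orders.get(c["id"], [])) for c in customers]

    Behind its interface the Order module is free to implement that bulk call as a single SQL query against its own tables, so the join happens inside the owning module rather than across module boundaries.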

    Read the article

  • Retrieving Data From formData in Rails jquery-file-upload

    - by CanCeylan
    I am trying to add additional form data to the jQuery-File-Upload plugin in my Rails app by following this tutorial: https://github.com/blueimp/jQuery-File-Upload/wiki/How-to-submit-additional-form-data. I'm following the instructions for setting formData on upload start for each individual file upload. My problem is that after saving the files with their titles as explained in the tutorial, I cannot show them in the final table, because I don't know how to reach the formData's values. How should I reach the data inside data.formData = inputs.serializeArray(); and post it next to each item? Thanks.

    Read the article

  • Accessing data feed

    - by racket99
    I have a Java program on my desktop which displays financial data gleaned from the web. It is a 3rd party application. What I would like to do is intercept the data before it goes to the Java application and record it into a flat file for the purpose of later data analysis. Is this at all possible? I imagine the data are available and are entering my computer through some port which the Java app picks up and then displays. Help appreciated. Thanks
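
    One way this is sometimes done, when the feed is plain TCP and the application lets you change its server host and port, is to point it at a small local proxy that forwards traffic to the real server and appends a copy to a flat file. A rough sketch of such a proxy; the host and ports are placeholders, and if the feed is encrypted or the port isn't configurable this won't work and a packet-capture tool would be the alternative:

        import socket
        import threading

        LISTEN_PORT = 9000                       # point the Java app at localhost:9000 instead of the real server
        UPSTREAM = ("feed.example.com", 5555)    # placeholder: the real feed host and port
        LOG_FILE = "feed_capture.dat"

        def pump(src, dst, log=None):
            """Copy bytes from src to dst, optionally appending them to a log file."""
            while True:
                chunk = src.recv(4096)
                if not chunk:
                    break
                if log is not None:
                    log.write(chunk)
                    log.flush()
                dst.sendall(chunk)

        def main():
            server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            server.bind(("127.0.0.1", LISTEN_PORT))
            server.listen(1)
            client, _ = server.accept()
            upstream = socket.create_connection(UPSTREAM)
            with open(LOG_FILE, "ab") as log:
                # Forward the app's requests untouched; record only the feed coming back.
                threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
                pump(upstream, client, log)

        if __name__ == "__main__":
            main()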

    Read the article

  • Nested/Sub data types in haskell

    - by Tom Carstens
    So what would be nice is if you could do something like the following (not necessarily with this format, just the general idea):

        data Minor = MinorA | MinorB
        data Major = Minor | MajorB

        isMinor :: Major -> Bool
        isMinor Minor = True
        isMinor _     = False

    So isMinor MinorA would report True (instead of an error). At the moment you might do something like:

        data Major = MinorA | MinorB | MajorB

        isMinor :: Major -> Bool
        isMinor MinorA = True
        isMinor MinorB = True
        isMinor _      = False

    It's not terrible or anything, but it doesn't expand nicely (if Minor went up to MinorZ this would be terribly clunky). To avoid that problem you can wrap Minor:

        data Minor = MinorA | MinorB
        data Major = MajorA Minor | MajorB

        isMinor :: Major -> Bool
        isMinor (MajorA _) = True
        isMinor _          = False

    But now you have to make sure to wrap your Minors to use them as a Major... again, not terrible; it just doesn't really express the semantics I'd like very well (i.e. Major can be any Minor or MajorB). The first (legal) example is "Major can be MinorA..." but doesn't have any knowledge of Minor, and the second is "Major can be MajorA, which takes a Minor...". p.s. No, this isn't really about anything concrete.

    Read the article

  • MySQL ADO.NET Connector & MSSQL Integration Services

    - by user1114330
    Here I am, day three... attempting to sync a data view on a Windows Vista box (64-bit) running MSSQL 2012 and Visual Studio 2010. Sanity is slipping and hunger for progress fills my attention. I went through hell trying to get the MySQL ODBC drivers to do the job, but to no avail... everyone seems to be lost, and all the threads I can find offer solutions that do not work for me. The problem: System DSNs not being seen by SSIS ("SSIS DSN Not Showing as ODBC Data Source").

    I make the decision to try out the ADO.NET connector... and to my surprise it is actually in the selection list of data sources in SSIS. So I take off running to create a Data Flow Task and an ADO.NET Source (a local MSSQL DB)... all is good as usual. Then I move swiftly to creating an ADO.NET Destination, enter my credentials... wow, I am finally selecting a database on my Linux server! Happy, thinking that I have finally figured out a way to get the job done. Then I move to mappings... nope, something is wrong... I am getting an error that hurts my eyes:

        Pipeline component has returned HRESULT error code 0xC0208457 from a method call.
        Error at Data Flow Task [ADO NET Destination [81]]: Failed to get properties of external columns.
        The table name you entered may not exist or you do not have SELECT permission on the table object
        and an alternative attempt to get column properties through connection has failed.
        Detailed error messages are:
        You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version
        for the right syntax to use near '"database".tablename' at line 1.
        The descriptor files on path C:\Program Files (x86)\Microsoft SQL Server\110\DTS\ProviderDescriptors\
        does not contain schema information for connection of type MySQL.Data.MySqlClient.MySqlConnection.

    So it looks like it can't get the information, and therefore I cannot map the tables properly. Any ideas on this would be ultra helpful... thanks in advance to all!

    Read the article

  • AutoVue at the Oracle Asset Lifecycle Management Summit

    - by celine.beck
    I recently had the opportunity to attend and present the integration between AutoVue and Primavera P6 during the Oracle ALM Summit, which was held in March at Redwood Shores, on Oracle Headquarters grounds. The ALM Summit brought together over 300 Oracle maintenance practitioners who endured the foggy and rainy San Francisco weather to attend the 4th edition of this Oracle-driven conference. Attendees have roles in maintenance management and IT.

    Following a general session, Ralph Rio from ARC Advisory Group provided a very interesting keynote discussing Asset Management directions, both in the short and the long run. An interesting point that Ralph raised is that most organizations have done a good job of improving performance in the design/build, operate and maintain, and portfolio management phases by leveraging solutions like Asset Lifecycle Management and Project & Portfolio Management; however, there seems to be room for improvement in between those phases, when information flows from one group to the other, during the data handover phase or when the time comes to update or modify drawings to reflect the reality of physical assets.

    This is where AutoVue comes into play. By integrating with enterprise applications like content management systems, asset lifecycle management applications and project management solutions, AutoVue can be a real process enabler, streamlining information flows from concept/design to decommissioning and ensuring that all project stakeholders have access to asset information and engineering data throughout the asset lifecycle. AutoVue's built-in digital annotation capabilities allow maintenance workers and technicians to report changes in configuration and visually capture the delta between as-built and as-maintained versions of asset documents. This information can then be easily handed over to engineers, who can identify changes and incorporate these modifications into the drawings during the next round of document revisions. PPL Power Generation, an electric utility headquartered in Allentown, Pennsylvania, discussed this usage of AutoVue during an interesting webcast on AutoVue's role in the utilities space.

    After the keynote sessions, participants broke off into product-centric tracks around Oracle's Asset Lifecycle Management solutions (E-Business Suite, PeopleSoft, and JD Edwards). The second day of the conference was the occasion for us to present the integration between AutoVue and Primavera P6 to the Maintenance Summit audience. The presentation was a great success and generated much discussion with partners and customers during breaks. People seemed highly interested in learning more about our plans for integrating AutoVue and Primavera P6 with Oracle's ALM solutions... stay tuned for further information on the subject!

    Read the article

  • Guessing Excel Data Types

    - by AjarnMark
    Note to Self

        HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel: TypeGuessRows = 0 means scan everything.

    Note to Others

    About 10 years ago I stumbled across this bit of information just when I needed it, and it saved my project. Then, a few years later, when it would have been nice but not critical, for some reason I could not find it again anywhere. Well, now I have stumbled across it again, and to preserve my future self from nightmares and sudden baldness due to pulling my hair out, I have decided to blog it in the hope that I can find it again this way.

    Here's the story... When you query data from an Excel spreadsheet, such as with old-fashioned DTS packages in SQL 2000 (my first reference) or simply with an OLEDB Data Adapter from ASP.NET (a recent task), and if you are using the Microsoft Jet 4.0 driver (newer ones may deal with this differently), then you can get funny results where the query reports that a cell value is null even when you know it contains data.

    What happens is that Excel doesn't really have data types. While you can format information in cells to appear like certain data types (e.g. Date, Time, Decimal, Text, etc.), that is not really defining the cell as being of a certain type like we think of when working with databases. But, presumably to make things more convenient for the user (programmer), when you issue a query against Excel, the query processor tries to guess what type of data is contained in each column and returns it in an appropriate manner. This is all well and good IF your data is consistent in every row and matches what the processor guessed. And, for efficiency's sake, when the query processor is trying to figure out each column's data type, it does so by analyzing only the first 8 rows of data (the default setting).

    Now here's the problem: suppose that your spreadsheet contains information about clothing, and one of the columns is Size. Suppose that in the first 8 rows all of your sizes look like 32, 34, 18, 10, and so on, using numbers, but then, somewhere after the 8th row, you have some rows with sizes like S, M, L, XL. By examining only the first 8 rows, the query processor inferred that the column contained numerical data, and when it hits the non-numerical data in later rows, it comes back blank. Major bummer, and a real pain to track down if you don't know that Excel is doing this, because you study the spreadsheet and say, "The data is RIGHT THERE! WHY doesn't the query see it?!?!" And the hair-pulling begins.

    So, what's a developer to do? One option is to go to the registry setting noted above and change the DWORD value of TypeGuessRows from the default of 8 to 0 (zero). Setting this value to zero will force Jet to scan every row in the spreadsheet before making its determination as to what type of data the column contains. That means that in the example above it would have treated the column as a string rather than as numeric, and presto! Your query now returns all of the values that you know are in there.

    Of course, there is a caveat... if you are querying large spreadsheets, making Jet scan every row can be quite a performance hit. You could enter a different number (more than 8) that you believe is a better sampling of rows on which to base the guess, but you still have the possibility that every row scanned looks alike while later rows are different, and that you might get blanks when there really is data there. That's the type of gamble I really don't like to take with my data. Anyone with a better approach, or with experience with more recent drivers that have a better way of handling data types, please chime in!
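
    For reference, the registry tweak described above can be scripted. A minimal sketch using Python's winreg module, assuming the key lives at the path quoted in the note; on 64-bit Windows with a 32-bit driver it may instead sit under a Wow6432Node path, and writing to HKLM requires administrator rights:

        import winreg

        # Path from the note above; adjust (e.g. SOFTWARE\Wow6432Node\...) for a 32-bit driver on 64-bit Windows.
        KEY_PATH = r"SOFTWARE\Microsoft\Jet\4.0\Engines\Excel"

        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE) as key:
            # 0 = scan every row before guessing a column's data type (the default is 8).
            winreg.SetValueEx(key, "TypeGuessRows", 0, winreg.REG_DWORD, 0)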

    Read the article

  • What guidelines are best suited for leveraging automatic deployments?

    - by Scott
    We are hoping to leverage a static code analysis tool (Sonar) as part of our continuous integration server, and are hoping to determine some useful guidelines to serve as a base for allowing a deployment to continue. What conditions should we make mandatory before allowing a build to proceed to the next set of testing? The obvious answers include that it compiles and that the unit tests pass. But what other things should we require before allowing a build to proceed rather than be rolled back?

    Read the article

  • SSIS Dashboard 0.5.2 and Live Demo Website

    - by Davide Mauri
    In the last days I've worked again on the SQL Server Integration Services Dashboard and I have made some updates in this beta:

        - Added support for the "*" wildcard in project names. Now you can filter on a specific project name using a URL like: http://<yourserver>/project/MyPro*
        - Added initial support for Package Execution History. Just click on a package name and you'll see its latest 15 executions.

    I've also created a live demo website for all those who want to give it a try before downloading and using it: http://ssis-dashboard.azurewebsites.net/

    Read the article

  • Communicator Messages not being saved to Outlook due to Outlook Integration error

    - by Mark Rogers
    For the most part my Office Communicator appears to be configured correctly. I can log in to my work account and see the work contact list. Outlook is working perfectly; it had a weird profile problem initially, but that was fixed. Unfortunately, even though I have enabled the setting that says "Save my instant message conversations in the Outlook Conversation History folder," my conversations have stopped saving to the Outlook Conversation History folder. Also, I have a yellow warning marker on top of the server icon next to the status field. When I hover over or click on the marker, it says there is an Outlook Integration Error. The administrator is having trouble figuring out what is causing it. What can cause Outlook Integration Errors in Communicator, and how do I go about troubleshooting them?

    Read the article
