Search Results

Search found 58436 results on 2338 pages for 'multipartform data'.

Page 52/2338 | < Previous Page | 48 49 50 51 52 53 54 55 56 57 58 59  | Next Page >

  • Problem displaying data on one page

    - by user318068
    Hi, I have a problem with the following code. It is supposed to display the invites for each member, so that if a member has five invites, all five are shown on one page. At the moment it does not work that way: only one invite is displayed, and the next invite only appears after the current one has been accepted or rejected. That is not what I want; I want all of the invites shown together on one page. I think the problem is in the ordering of the code. My code: <?php session_start(); if (!isset($_SESSION['user_id'])) { header("Location: login.php"); } $id=$_SESSION['user_id']; ?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>Untitled Document</title> </head> <body> <center> <?php include("connect.php"); $sql =mysql_query("select * from ninvite where recieverMemberID ='$id' and viwed= '0'"); $num =mysql_num_rows($sql); echo $num ; if ($num>0) { while($row=mysql_fetch_array($sql)) { $sender=$row['SenderMemberID']; $room=$row['RoomID']; $sql =mysql_query("select MemberName from members where MemberID ='$sender' "); $sql1 =mysql_query("select RoomName from rooms where RoomID ='$room' "); while($row=mysql_fetch_array($sql)) {$mem =$row['MemberName']; } while($rows=mysql_fetch_array($sql1)) { $Ro =$rows['RoomName']; ?> <form action="join.php" method="post"> <label> </label> <br/> <label> <?php echo " you have invite from $mem to join $Ro"; ?> </label> <br/><br/> <label>accept</label> <input name="radio1" type="radio" value="accpet" /> <label>reject</label> <input name="radio1" type="radio" value="Reject" /><br/> <input type="submit" name="submit" value="done" /> </form> <?php } } } ?> </center> </body> </html> Thanks a lot. My SQL dump: -- phpMyAdmin SQL Dump -- version 3.2.4 -- http://www.phpmyadmin.net -- Host: localhost -- Generation Time: May 07, 2010 at 12:50 ?
-- Server version: 5.1.41 -- PHP Version: 5.3.1 SET SQL_MODE="NO_AUTO_VALUE_ON_ZERO"; /*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */; /*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */; /*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */; /*!40101 SET NAMES utf8 */; -- -- Database: tr -- -- Table structure for table joinroom CREATE TABLE IF NOT EXISTS joinroom ( MemberID int(10) NOT NULL, RoomID int(10) NOT NULL, PRIMARY KEY (MemberID,RoomID) ) ENGINE=MyISAM DEFAULT CHARSET=latin1; -- -- Dumping data for table joinroom INSERT INTO joinroom (MemberID, RoomID) VALUES (28, 1); -- -- Table structure for table members CREATE TABLE IF NOT EXISTS members ( MemberID int(10) unsigned NOT NULL AUTO_INCREMENT, MemberName varchar(20) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL, MemberPass varchar(10) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL, MemberEmail varchar(30) CHARACTER SET utf8 COLLATE utf8_bin NOT NULL, MemberLocation text CHARACTER SET utf8 COLLATE utf8_bin NOT NULL, MemberImg text CHARACTER SET utf8 COLLATE utf8_bin NOT NULL, PRIMARY KEY (MemberID) ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=34 ; -- -- Dumping data for table members INSERT INTO members (MemberID, MemberName, MemberPass, MemberEmail, MemberLocation, MemberImg) VALUES (28, 'marwa', '1234', '[email protected]', 'mmmmmm', 'dddddddddd'), (29, 'nora', '1234', '[email protected]', 'fffffffffffgg', 'gggggggggggggg'), (30, 'soso', '1234', '[email protected]', 'ffffffff', 'kkkkkkkkkkkkkkkkkk'), (31, 'gege', '1234', '[email protected]', 'kkkkkkkkkkkkkkkk', 'uuuuuuuuuuuuuuuuu'), (32, 'nono', '1234', '[email protected]', 'ggggggggggggaaaaa', 'aaaaaaaaaaaaaaa'), (33, 'nda', '1234', '[email protected]', 'kkkkkkkkkkkkkkkk', 'ooooooooooooooo'); -- -- Table structure for table ninvite CREATE TABLE IF NOT EXISTS ninvite ( SenderMemberID int(11) NOT NULL AUTO_INCREMENT, recieverMemberID varchar(30) NOT NULL, RoomID int(11) NOT NULL, viwed int(11) NOT NULL, PRIMARY KEY (SenderMemberID,recieverMemberID,RoomID) ) ENGINE=MyISAM DEFAULT CHARSET=latin1 AUTO_INCREMENT=33 ; -- -- Dumping data for table ninvite INSERT INTO ninvite (SenderMemberID, recieverMemberID, RoomID, viwed) VALUES (28, '33', 1, 0), (28, '32', 1, 0), (28, '31', 1, 0); /*!40101 SET CHARACTER_SET_CLIENT=@OLD_CHARACTER_SET_CLIENT */; /*!40101 SET CHARACTER_SET_RESULTS=@OLD_CHARACTER_SET_RESULTS */; /*!40101 SET COLLATION_CONNECTION=@OLD_COLLATION_CONNECTION */;
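
    One likely culprit, visible in the code above, is that the inner query reuses the $sql variable, so the outer loop's result set is clobbered after the first invite. A minimal sketch of an alternative, assuming the mysql_* API from the question and the ninvite/members/rooms tables it references: a single JOIN fetches every pending invite up front, so one loop can print them all on the page.

        <?php
        // Fetch all unviewed invites for the logged-in member in one query,
        // joining in the sender's name and the room's name.
        $result = mysql_query(
            "SELECT m.MemberName, r.RoomName
               FROM ninvite i
               JOIN members m ON m.MemberID = i.SenderMemberID
               JOIN rooms r ON r.RoomID = i.RoomID
              WHERE i.recieverMemberID = '$id' AND i.viwed = '0'");

        // One block per invite; every pending invite is rendered in the same pass.
        while ($row = mysql_fetch_array($result)) {
            echo 'you have an invite from ' . $row['MemberName'] .
                 ' to join ' . $row['RoomName'] . '<br/>';
        }
        ?>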

    Read the article

  • From DBA to Data Analyst

    - by Denise McInerney
    Cross posted from the PASS Blog There is a lot changing in the data professional’s world these days. More data is being produced and stored. More enterprises are trying to use that data to improve their products and services and understand their customers better. More data platforms and tools seem to be crowding the market. For a traditional DBA this can be a confusing and perhaps unsettling time. It’s also a time that offers great opportunity for career growth. I speak from personal experience. We sometimes refer to the “accidental DBA”, the person who finds herself suddenly responsible for managing the database because she has some other technical skills. While it was not accidental, six months ago I was unexpectedly offered a chance to transition out of my DBA role and become a data analyst. I have since come to view this offer as a gift, though at the time I wasn’t quite sure what to do with it. Throughout my DBA career I’ve gotten support from my PASS friends and colleagues and they were the first ones I turned to for counsel about this new situation. Everyone was encouraging and I received two pieces of valuable advice: first, leverage what I already know about data and second, work to understand the business’ needs. Bringing the power of data to bear to solve business problems is really the heart of the job. The challenge is figuring out how to do that. PASS had been the source of much of my technical training as a DBA, so I naturally started there to begin my Business Intelligence education. Once again the Virtual Chapter webinars, local chapter meetings and SQL Saturdays have been invaluable. I work in a large company where we are fortunate to have some very talented data scientists and analysts. These colleagues have been generous with their time and advice. I also took a statistics class through Coursera where I got a refresher in statistics and an introduction to the R programming language. And that’s not the end of the free resources available to someone wanting to acquire new skills. There are many knowledgeable Business Intelligence and Analytics professionals who teach through their blogs. Every day I can learn something new from one of these experts. Sometimes we plan our next career move and sometimes it just happens. Either way a database professional who follows industry developments and acquires new skills will be better prepared when change comes. Take the opportunity to learn something about the changing data landscape and attend a Business Intelligence, Business Analytics or Big Data Virtual Chapter meeting. And if you are moving into this new world of data consider attending the PASS Business Analytics Conference in April where you can meet and learn from those who are already on that road. It’s been said that “the only thing constant is change.” That’s never been more true for the data professional than it is today. But if you are someone who loves data and grasps its potential you are in the right place at the right time.

    Read the article

  • Blink-Data vs Instinct?

    - by Samantha.Y. Ma
    In his landmark bestseller Blink, well-known author and journalist Malcolm Gladwell explores how human beings make seemingly instantaneous choices every day --in the blink of an eye--and how we “think without thinking.” These situations actually aren’t as simple as they seem, he postulates; and throughout the book, Gladwell seeks answers to questions such as:
    1. What makes some people good at thinking on their feet and making quick, spontaneous decisions?
    2. Why do some people follow their instincts and win, while others consistently seem to stumble into error?
    3. Why are some of the best decisions often those that are difficult to explain to others?
    In Blink, Gladwell introduces us to the psychologist who has learned to predict whether a marriage will last, based on a few minutes of observing a couple; the tennis coach who knows when a player will double-fault before the racket even makes contact with the ball; the antiquities experts who recognize a fake at a glance. Ultimately, Blink reveals that great decision makers aren't those who spend the most time deliberating or analyzing information, but those who focus on key factors among an overwhelming number of variables-- i.e., those who have perfected the art of "thin-slicing.”
    In Data vs. Instinct: Perfecting Global Sales Performance, a new report sponsored by Oracle, the Economist Intelligence Unit (EIU) explores the roles data and instinct play in decision-making by sales managers and discusses how sales executives can increase sales performance through more effective territory planning and incentive/compensation strategies. If you are a sales executive, ask yourself this: do you rely on knowledge (data) when you plan out your sales strategy? If you rely on data, how do you ensure that your data sources are reliable, up-to-date, and complete? With the emergence of social media and the proliferation of both structured and unstructured data, how do you know that you are applying your information correctly and in context? Three key findings in the report are:
    • Six out of ten executives say they rely more on data than instinct to drive decisions.
    • Nearly one half (48 percent) of incentive compensation plans do not achieve the desired results.
    • Senior sales executives rely more on current and historical data than on forecast data.
    Strikingly similar to what Gladwell concludes in Blink, the report’s authors succinctly sum up their findings: "The best outcome is a combination of timely information, insightful predictions, and support data." Applying this insight is crucial to creating a sound sales plan that drives alignment and results. In the area of sales performance management, “territory programs and incentive compensation continue to present particularly complex challenges in an increasingly globalized market," say the report’s authors. "It behooves companies to get a better handle on translating that data into actionable and effective plans." To help solve this challenge, Oracle Fusion CRM integrates forecasting, quotas, compensation, and territories into a single system. For example, Oracle Fusion CRM provides a natural integration between territories, which define the sales targets (e.g., collection of accounts) for the sales force, and quotas, which quantify the sales targets. In fact, territory hierarchy is a core analytic dimension to slice and dice sales results, using sales analytics and alerts to help you identify where problems are occurring.
    Start tapping into both data and instinct effectively today with Oracle Fusion CRM. Here is a short video to provide you with a snapshot of how it can help you optimize your sales performance.

    Read the article

  • Reaching to the Holy Grail of Data Management

    - by Irem Radzik
    Pervasive, continuous access to trusted data. That’s the ultimate goal of data management. It enables organizations to leverage data as an asset to create value for their customers and themselves, and it creates the strong foundation needed to move the business forward. How you get there is also critical. As with all IT initiatives, using high-performance solutions with a low cost of ownership is another key requirement in today’s IT world. Oracle's data integration product strategy focuses on helping customers achieve this ultimate goal with high performance and low TCO. At OpenWorld, we will be showing how Oracle Data Integration products help you reach your data management goals, considering new trends in information management, such as big data and cloud computing. We will also provide an update on the latest product releases, such as Oracle GoldenGate 11gR2. If you will be at OpenWorld, please join us on Monday, Oct 1st at 10:45am at Moscone West – 3005 to hear our VP of Product Development, Brad Adelberg, present "Future Strategy, Direction, and Roadmap of Oracle’s Data Integration Platform". The Data Integration track at OpenWorld covers a variety of topics and speakers. In addition to product management of Oracle GoldenGate, Oracle Data Integrator, and Enterprise Data Quality presenting product updates and roadmaps, we have several customer panels and stand-alone sessions featuring select customers such as St. Jude Medical, Raymond James, Aderas, Turkcell, Paychex, Comcast, Ticketmaster, Bank of America, and more. You can see an overview of Data Integration sessions here. If you are not able to attend OpenWorld, please check out our latest resources for Data Integration and Oracle GoldenGate. In the coming weeks you will see more blogs about our products’ new capabilities and what to expect at OpenWorld. I hope to see you at OpenWorld and stay in touch via our future blogs.

    Read the article

  • Core Data migration failing with error: Failed to save new store after first pass of migration

    - by unforgiven
    In the past I had already implemented successfully automatic migration from version 1 of my data model to version 2. Now, using SDK 3.1.3, migrating from version 2 to version 3 fails with the following error: Unresolved error Error Domain=NSCocoaErrorDomain Code=134110 UserInfo=0x5363360 "Operation could not be completed. (Cocoa error 134110.)", { NSUnderlyingError = Error Domain=NSCocoaErrorDomain Code=256 UserInfo=0x53622b0 "Operation could not be completed. (Cocoa error 256.)"; reason = "Failed to save new store after first pass of migration."; } I have tried automatic migration using NSMigratePersistentStoresAutomaticallyOption and NSInferMappingModelAutomaticallyOption and also migration using only NSMigratePersistentStoresAutomaticallyOption, providing a mapping model from v2 to v3. I see the above error logged, and no object is available in the application. However, if I quit the application and reopen it, everything is in place and working. The Core Data methods I am using are the following ones - (NSManagedObjectModel *)managedObjectModel { if (managedObjectModel != nil) { return managedObjectModel; } NSString *path = [[NSBundle mainBundle] pathForResource:@"MYAPP" ofType:@"momd"]; NSURL *momURL = [NSURL fileURLWithPath:path]; managedObjectModel = [[NSManagedObjectModel alloc] initWithContentsOfURL:momURL]; return managedObjectModel; } - (NSManagedObjectContext *) managedObjectContext { if (managedObjectContext != nil) { return managedObjectContext; } NSPersistentStoreCoordinator *coordinator = [self persistentStoreCoordinator]; if (coordinator != nil) { managedObjectContext = [[NSManagedObjectContext alloc] init]; [managedObjectContext setPersistentStoreCoordinator: coordinator]; } return managedObjectContext; } - (NSPersistentStoreCoordinator *)persistentStoreCoordinator { if (persistentStoreCoordinator != nil) { return persistentStoreCoordinator; } NSURL *storeUrl = [NSURL fileURLWithPath: [[self applicationDocumentsDirectory] stringByAppendingPathComponent: @"MYAPP.sqlite"]]; NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption, [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil]; NSError *error = nil; persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel: [self managedObjectModel]]; if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:nil URL:storeUrl options:options error:&error]) { // Handle error NSLog(@"Unresolved error %@, %@", error, [error userInfo]); } return persistentStoreCoordinator; } In the simulator, I see that this generates a MYAPP~.sqlite files and a MYAPP.sqlite file. I tried to remove the MYAPP~.sqlite file, but BOOL oldExists = [[NSFileManager defaultManager] fileExistsAtPath: [[self applicationDocumentsDirectory] stringByAppendingPathComponent: @"MYAPP~.sqlite"]]; always returns NO. Any clue? Am I doing something wrong? Thank you in advance.
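
    A small diagnostic that may help narrow this down: read the existing store's metadata and ask the current model whether it is compatible, which separates "migration ran and failed" from "the store was written by a model the current bundle no longer matches". This is only a sketch, reusing the storeUrl and managedObjectModel from the code above.

        NSError *metadataError = nil;
        // Read the metadata recorded in the existing SQLite store.
        NSDictionary *metadata =
            [NSPersistentStoreCoordinator metadataForPersistentStoreOfType:NSSQLiteStoreType
                                                                       URL:storeUrl
                                                                     error:&metadataError];
        if (metadata == nil) {
            NSLog(@"Could not read store metadata: %@", metadataError);
        } else {
            // YES means the store already matches the current (v3) model and no migration runs.
            BOOL compatible = [[self managedObjectModel] isConfiguration:nil
                                              compatibleWithStoreMetadata:metadata];
            NSLog(@"Store compatible with current model: %d", compatible);
            NSLog(@"Source store version hashes: %@",
                  [metadata objectForKey:NSStoreModelVersionHashesKey]);
        }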

    Read the article

  • ABPerson in Core Data

    - by eman
    I'm trying to figure out how to store a reference to an ABPerson in a Core Data store in an iPhone app. Ultimately, I'd like to be able to sync with a Mac version of the app (I'm assuming ABRecordIDs wouldn't be the same for the iPhone and the Mac). I was thinking of storing the record ID, name, and email and checking against those--is there a better way?
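
    A hedged sketch of the "store the ID plus matching fields" idea, using the AddressBook C API; the entity name (@"PersonRef"), its attribute names, and the context variable are assumptions for illustration only.

        // person is an ABRecordRef for the contact the user picked.
        ABRecordID recordID = ABRecordGetRecordID(person);
        NSString *name = [(NSString *)ABRecordCopyCompositeName(person) autorelease];

        // Grab the first email address, if any, as a fallback match key.
        NSString *email = nil;
        ABMultiValueRef emails = ABRecordCopyValue(person, kABPersonEmailProperty);
        if (emails != NULL && ABMultiValueGetCount(emails) > 0) {
            email = [(NSString *)ABMultiValueCopyValueAtIndex(emails, 0) autorelease];
        }
        if (emails != NULL) CFRelease(emails);

        // Store all three so the contact can be re-matched by ID first, then by name/email.
        NSManagedObject *ref =
            [NSEntityDescription insertNewObjectForEntityForName:@"PersonRef"
                                           inManagedObjectContext:context];
        [ref setValue:[NSNumber numberWithInt:recordID] forKey:@"recordID"];
        [ref setValue:name forKey:@"name"];
        [ref setValue:email forKey:@"email"];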

    Read the article

  • Core Data - Calculated Fields

    - by Jacob
    Hi, I know how to use Core Data with a UITableView, but how can I use NSFetchedResultsController to get calculated fields? Is there an example I can follow? For instance, I want to go through all the NSManagedObjects and add up their "mark" field, but can this be done in an easier way or do I have to do it all manually? Thanks
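
    If the total is all that's needed, one lightweight option is a KVC collection operator over the objects the fetched results controller already holds; a minimal sketch, assuming "mark" is a numeric attribute:

        // Sum (and average) the "mark" attribute across everything the controller fetched.
        NSArray *objects = [fetchedResultsController fetchedObjects];
        NSNumber *totalMarks = [objects valueForKeyPath:@"@sum.mark"];
        NSNumber *averageMark = [objects valueForKeyPath:@"@avg.mark"];
        NSLog(@"total=%@ average=%@", totalMarks, averageMark);

    For larger data sets an NSExpression-based fetch with NSDictionaryResultType can push the aggregation into the store, but for table-sized data the collection operator above is usually enough.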

    Read the article

  • System.Data.SQLite parameter issue

    - by CasperT
    I have the following code: try { //Create connection SQLiteConnection conn = DBConnection.OpenDB(); //Verify user input, normally you give dbType a size, but Text is an exception var uNavnParam = new SQLiteParameter("@uNavnParam", SqlDbType.Text) { Value = uNavn }; var bNavnParam = new SQLiteParameter("@bNavnParam", SqlDbType.Text) { Value = bNavn }; var passwdParam = new SQLiteParameter("@passwdParam", SqlDbType.Text) {Value = passwd}; var pc_idParam = new SQLiteParameter("@pc_idParam", SqlDbType.TinyInt) { Value = pc_id }; var noterParam = new SQLiteParameter("@noterParam", SqlDbType.Text) { Value = noter }; var licens_idParam = new SQLiteParameter("@licens_idParam", SqlDbType.TinyInt) { Value = licens_id }; var insertSQL = new SQLiteCommand("INSERT INTO Brugere (navn, brugernavn, password, pc_id, noter, licens_id)" + "VALUES ('@uNameParam', '@bNavnParam', '@passwdParam', '@pc_idParam', '@noterParam', '@licens_idParam')", conn); insertSQL.Parameters.Add(uNavnParam); //replace paramenter with verified userinput insertSQL.Parameters.Add(bNavnParam); insertSQL.Parameters.Add(passwdParam); insertSQL.Parameters.Add(pc_idParam); insertSQL.Parameters.Add(noterParam); insertSQL.Parameters.Add(licens_idParam); insertSQL.ExecuteNonQuery(); //Execute query //Close connection DBConnection.CloseDB(conn); //Let the user know that it was changed succesfully this.Text = "Succes! Changed!"; } catch(SQLiteException e) { //Catch error MessageBox.Show(e.ToString(), "ALARM"); } It executes perfectly, but when I view my "brugere" table, it has inserted the values: '@uNameParam', '@bNavnParam', '@passwdParam', '@pc_idParam', '@noterParam', '@licens_idParam' literally. Instead of replacing them. I have tried making a breakpoint and checked the parameters, they do have the correct assigned values. So that is not the issue either. I have been tinkering with this a lot now, with no luck, can anyone help? Oh and for reference, here is the OpenDB method from the DBConnection class: public static SQLiteConnection OpenDB() { try { //Gets connectionstring from app.config const string myConnectString = "data source=data;"; var conn = new SQLiteConnection(myConnectString); conn.Open(); return conn; } catch (SQLiteException e) { MessageBox.Show(e.ToString(), "ALARM"); return null; } }
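
    The placeholders are wrapped in single quotes inside the SQL, so SQLite treats them as string literals rather than as parameters to bind (note also that the SQL says @uNameParam while the parameter object is named @uNavnParam). A minimal sketch of the same insert with unquoted placeholders, keeping the question's table and column names:

        // Unquoted placeholders let the provider bind the parameter values.
        var insertSQL = new SQLiteCommand(
            "INSERT INTO Brugere (navn, brugernavn, password, pc_id, noter, licens_id) " +
            "VALUES (@uNavnParam, @bNavnParam, @passwdParam, @pc_idParam, @noterParam, @licens_idParam)",
            conn);

        insertSQL.Parameters.AddWithValue("@uNavnParam", uNavn);
        insertSQL.Parameters.AddWithValue("@bNavnParam", bNavn);
        insertSQL.Parameters.AddWithValue("@passwdParam", passwd);
        insertSQL.Parameters.AddWithValue("@pc_idParam", pc_id);
        insertSQL.Parameters.AddWithValue("@noterParam", noter);
        insertSQL.Parameters.AddWithValue("@licens_idParam", licens_id);

        insertSQL.ExecuteNonQuery();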

    Read the article

  • Asp.net Dynamic Data - Field not rendered on List page

    - by Christo Fur
    I have created a Dynamic Data site against an Entity Framework model. I have two fields which are nvarchar(max) in the DB and they do not get rendered in the list view. This is probably a sensible default, but how do I override it? I have tried adding various attributes to my metadata class, e.g. [ScaffoldColumn(true)] [UIHint("RuleData")] but no joy with that. Any ideas?
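
    For reference, a sketch of the buddy-class pattern those attributes are expected to live in; the class and property names here are placeholders, and whether the stock list page honors them for long-text columns still depends on the field template the UIHint points to.

        using System.ComponentModel.DataAnnotations;

        // The partial class name must match the entity generated by the Entity Framework model.
        [MetadataType(typeof(RuleMetadata))]
        public partial class Rule
        {
        }

        public class RuleMetadata
        {
            [ScaffoldColumn(true)]      // force the column back into scaffolding
            [UIHint("RuleData")]        // render with the custom RuleData field template
            public object RuleData { get; set; }
        }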

    Read the article

  • Sharing data between graphics and physics engine in the game?

    - by PolGraphic
    I'm writing a game engine that consists of a few modules. Two of them are the graphics engine and the physics engine. I wonder whether it's a good solution to share data between them. The two approaches (sharing or not) look like this: Without sharing data GraphicsModel{ //some common for graphics and physics data like position //some only graphic data //like textures and detailed model's verticles that physics doesn't need }; PhysicsModel{ //some common for graphics and physics data like position //some only physics data //usually my physics data contains A LOT more informations than graphics data } engine3D->createModel3D(...); physicsEngine->createModel3D(...); //connect graphics and physics data //e.g. update graphics model's position when physics model's position will change I see two main problems: A lot of redundant data (like two positions for both physics and graphics data) Problems with updating data (I have to manually update graphics data when physics data changes) With sharing data Model{ //some common for graphics and physics data like position }; GraphicModel : public Model{ //some only graphics data //like textures and detailed model's verticles that physics doesn't need }; PhysicsModel : public Model{ //some only physics data //usually my physics data contains A LOT more informations than graphics data } model = engine3D->createModel3D(...); physicsEngine->assingModel3D(&model); //will cast to //PhysicsModel for it's purposes?? //when physics changes anything (like position) in model //(which it treats like PhysicsModel), the position for graphics data //will change as well (because it's the same model) Problems here: physicsEngine cannot create new objects, just "assign" existing ones from engine3D (somehow it feels less independent to me) Casting data in the assingModel3D function physicsEngine and graphicsEngine must be careful - they cannot delete data when they no longer need it (because the other one may still need it). But that's a rare situation. Moreover, they can just delete the pointer, not the object. Or we can assume that graphicsEngine will delete objects and physicsEngine just the pointers to them. Which way is better? Which will produce more problems in the future? I like the second solution more, but I wonder why most graphics and physics engines prefer the first one (maybe because they normally make only a graphics or only a physics engine and somebody else connects them in the game?). Do they have any more hidden pros and cons?

    Read the article

  • Getting Data from WinForms ListView Control

    - by James
    I need to retrieve my data from a ListView control set up in Details mode with 5 columns. I tried using this code: MessageBox.Show(ManageList.SelectedItems(0).Text) And it works, but only for the first selected item (item 0). If I try this: MessageBox.Show(ManageList.SelectedItems(2).Text) I get this error: InvalidArgument=Value of '2' is not valid for 'index'. Parameter name: index I have no clue how I can fix this, any help? Edit: Sorry, should have said, I'm using Windows.Forms :)
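
    SelectedItems indexes the selected rows, not the columns of a row, which is presumably why index 2 fails when only one row is selected; the per-column values of a Details-mode row live in its SubItems collection. A short sketch in the same style as the snippets above:

        ' Third column (index 2) of the first selected row.
        If ManageList.SelectedItems.Count > 0 Then
            MessageBox.Show(ManageList.SelectedItems(0).SubItems(2).Text)
        End If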

    Read the article

  • Export Core Data Entity as text files in Cocoa

    - by happyCoding25
    Hello, I have an entity in Core Data that has two attributes, both strings: one called "name" and another called "message". I need a method that creates text files for all of the entries the user has added. I want the file names to come from the name attribute and the contents to come from the message attribute. If anyone knows how to do this, any help would be great. Thanks for any help.
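
    A minimal sketch of one way to do it: fetch every object of the entity and write one UTF-8 text file per object. The entity name (@"Entry"), the folder path variable, and the context variable are assumptions for illustration.

        // Fetch every object of the entity.
        NSFetchRequest *request = [[[NSFetchRequest alloc] init] autorelease];
        [request setEntity:[NSEntityDescription entityForName:@"Entry"
                                       inManagedObjectContext:context]];

        NSError *error = nil;
        NSArray *objects = [context executeFetchRequest:request error:&error];

        // One file per object: "<name>.txt" containing the message attribute.
        for (NSManagedObject *object in objects) {
            NSString *fileName = [[object valueForKey:@"name"] stringByAppendingPathExtension:@"txt"];
            NSString *path = [folder stringByAppendingPathComponent:fileName];
            [[object valueForKey:@"message"] writeToFile:path
                                              atomically:YES
                                                encoding:NSUTF8StringEncoding
                                                   error:&error];
        }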

    Read the article

  • Dynamic Data: how to filter dropdown for foreign key on edit page

    - by Leonv
    I have an Organisation with a foreign key to a Manager. Managers can be active or inactive. On the Dynamic Data edit page for Organisation, I need to filter the dropdown for Manager to only show active records. I started out by making a custom version of DynamicData\FieldTemplates\ForeignKey_Edit.ascx and setting a UIHint to the new field template on Organisation.Manager. But how do I customize the LINQ or SQL query that runs to load the Managers? I am using LINQ to SQL and the Dynamic Data Futures library.
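
    A hedged sketch of what the custom field template's code-behind could do: rebind the dropdown from a filtered LINQ to SQL query instead of letting the template load every Manager. The control name (DropDownList1), the data context, and the property names are assumptions based on the stock ForeignKey_Edit template.

        protected void Page_Load(object sender, EventArgs e)
        {
            if (Page.IsPostBack)
            {
                return;
            }

            // Replace the default list with only the active managers.
            DropDownList1.Items.Clear();
            if (!Column.IsRequired)
            {
                DropDownList1.Items.Add(new ListItem("[Not Set]", ""));
            }

            using (var db = new MyDataContext())
            {
                foreach (var manager in db.Managers.Where(m => m.IsActive))
                {
                    DropDownList1.Items.Add(new ListItem(manager.Name, manager.ManagerId.ToString()));
                }
            }
        }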

    Read the article

  • dynamic data - all option in paging

    - by Sharique
    I'm developing a Dynamic Data web app. The List.aspx page has a GridViewPager control for paging, with options in a drop-down list such as 10, 20, ... rows per page, but it does not have an option to show all rows on one page. How can I add an "All" option to it?

    Read the article

  • Dynamic Data with Subsonic 3

    - by Ezequiel Bertti
    I want to make a web project with SubSonic and Dynamic Data, but when I go to register the data context I don't have one from SubSonic with LINQ. In Global.asax.cs I have to do something like this: model.RegisterContext(SubSonicRepo, new ContextConfiguration() { ScaffoldAllTables = true }); How can I make it work? Is there some way to make it work? Using Entity Framework or LINQ to SQL everything works, but using SubSonic with LINQ it does not.

    Read the article

  • A good web data extraction/screen scraper program?

    - by Taylor
    I need to capture product data from a site on a regular basis and wondered if anyone knows of a good software program? I've trialed Mozenda, but it's a monthly subscription and pricey in the long term. Obviously something that's free would be best, but I don't mind paying either. I just need a decent program that's reliable and doesn't require much programming knowledge.

    Read the article

  • I keep on getting "save operation failure" after any change on my XCode Data Model

    - by Philip Schoch
    I started using Core Data for iPhone development. I started out by creating a very simple entity (called Evaluation) with just one string property (called evaluationTopic). I had the following code for inserting a fresh string: - (void)insertNewObject { // Create a new instance of the entity managed by the fetched results controller. NSManagedObjectContext *context = [fetchedResultsController managedObjectContext]; NSEntityDescription *entity = [[fetchedResultsController fetchRequest] entity]; NSManagedObject *newManagedObject = [NSEntityDescription insertNewObjectForEntityForName:[entity name] inManagedObjectContext:context]; // If appropriate, configure the new managed object. [newManagedObject setValue:@"My Repeating String" forKey:@"evaluationTopic"]; // Save the context. NSError *error; if (![context save:&error]) { // Handle the error... } [self.tableView reloadData]; } This worked perfectly fine, and by pushing the + button a new "My Repeating String" would be added to the table view and stored in the persistent store. I then pressed "Design - Add Model Version" in Xcode. I added three entities to the existing entity and also added new properties to the existing "Evaluation" entity. Then, I created new files from the entities by pressing "File - New File - Managed Object Classes" and created new .h and .m files for my four entities, including the "Evaluation" entity with Evaluation.h and Evaluation.m. Now I changed the model version by setting "Design - Data Model - Set Current Version". After having done all this, I changed my insert method: - (void)insertNewObject { // Create a new instance of the entity managed by the fetched results controller. NSManagedObjectContext *context = [fetchedResultsController managedObjectContext]; NSEntityDescription *entity = [[fetchedResultsController fetchRequest] entity]; Evaluation *evaluation = (Evaluation *) [NSEntityDescription insertNewObjectForEntityForName:[entity name] inManagedObjectContext:context]; // If appropriate, configure the new managed object. [evaluation setValue:@"My even new string" forKey:@"evaluationSpeechTopic"]; // Save the context. NSError *error; if (![context save:&error]) { // Handle the error... } [self.tableView reloadData]; } This does not work though! Every time I want to add a row the simulator crashes and I get the following: "NSInternalInconsistencyException', reason: 'This NSPersistentStoreCoordinator has no persistent stores. It cannot perform a save operation.'" I had this error before I knew about creating a new version after changing anything in the data model, but why is it still coming up? Do I need to do any mapping (even though I just added entities and properties that did not exist before)? In the Apple Dev tutorial it sounds very easy, but I have been struggling with this for a long time; it has never worked after changing the model version.

    Read the article

  • Data Flow Diagram

    - by Nilesh
    Can anyone help me draw a data flow diagram for a travel request form for a company, in which an employee can request travel and have the request approved by his or her project manager and the HR department? Regards, Nils

    Read the article

  • Invert a stack, without using extra data structures?

    - by vks
    How would you invert a stack without using extra data structures, like a second or temporary stack? So no stack1-stack2 or stack-queue-stack implementation in the answer; you just have access to the push/pop operations of a standard stack. I think there is a way to do it by keeping a global counter and using pointer manipulation. If I solve it myself, I will post it. If someone else figures it out, please post your solution.
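
    A well-known approach uses recursion, so the call stack serves as the temporary storage instead of a second explicit stack; whether that satisfies the "no extra data structures" constraint is a matter of interpretation. A minimal sketch (a Python list plays the stack, with append/pop as push/pop):

        def insert_at_bottom(stack, item):
            """Push item underneath everything currently on the stack."""
            if not stack:
                stack.append(item)
            else:
                top = stack.pop()
                insert_at_bottom(stack, item)
                stack.append(top)

        def invert_stack(stack):
            """Reverse the stack in place using only push/pop and recursion."""
            if stack:
                top = stack.pop()
                invert_stack(stack)
                insert_at_bottom(stack, top)

        s = [1, 2, 3, 4, 5]
        invert_stack(s)
        print(s)  # [5, 4, 3, 2, 1]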

    Read the article

  • C# TCP socket and binary data

    - by MD
    Hi all, how do I send binary data (01110110, for example) with C# through a TCP socket (using SSL)? I'm using SslStream.Write(), and h[0] = (byte)Convert.ToByte("01110110"); isn't working. Thanks.
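
    One likely issue: Convert.ToByte("01110110") parses the string as a decimal number (1,110,110), which overflows a byte; passing base 2 treats the string as binary. A minimal sketch, assuming an already-authenticated SslStream named sslStream:

        // Parse the bit string as base 2 (0x76 == 118) and send it over the SSL stream.
        byte[] buffer = new byte[1];
        buffer[0] = Convert.ToByte("01110110", 2);

        sslStream.Write(buffer, 0, buffer.Length);
        sslStream.Flush();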

    Read the article
