Search Results

Search found 58653 results on 2347 pages for 'seed data'.

  • Propel-load-data is causing an error

    - by Jon Winstanley
    I am trying to load fixtures, but my project errors at the CLI and starts the indexer process. I have tried: rebuilding the schema and model; emptying the database and starting again; clearing the cache; validating the YML file and trying much simpler data dumps. My platform is Symfony 1.0 on Windows. Someone else also seems to have had the same issue in the past.

        C:\web\my_project>symfony propel-load-data backend
        >> propel load data from "C:\web\my_project\data\fixtures"
        PHP Warning: session_start(): Cannot send session cookie - headers already sent by (output started at C:\php\PEAR\symfony\vendor\pake\pakeFunction.php:366) in C:\php\PEAR\symfony\storage\sfSessionStorage.class.php on line 77
        Warning: session_start(): Cannot send session cookie - headers already sent by (output started at C:\php\PEAR\symfony\vendor\pake\pakeFunction.php:366) in C:\php\PEAR\symfony\storage\sfSessionStorage.class.php on line 77
        PHP Warning: session_start(): Cannot send session cache limiter - headers already sent (output started at C:\php\PEAR\symfony\vendor\pake\pakeFunction.php:366) in C:\php\PEAR\symfony\storage\sfSessionStorage.class.php on line 77
        Warning: session_start(): Cannot send session cache limiter - headers already sent (output started at C:\php\PEAR\symfony\vendor\pake\pakeFunction.php:366) in C:\php\PEAR\symfony\storage\sfSessionStorage.class.php on line 77

  • How does LinqPad support WCF Data Services?

    - by user341127
    LinqPad supports WCF Data Services. If you assign a URL such as http://services.odata.org/Northwind/Northwind.svc/, it will list all available data objects and you can query them. I guess LinqPad generates all available data classes at run time with Reflection.Emit. I am wondering if anyone can show me how to do so, or maybe someone has done it before. Any feedback is appreciated. Ying
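
    I don't know how LinqPad does this internally; the following is only a minimal .NET Framework sketch of the Reflection.Emit approach the question guesses at, emitting one hypothetical "Customer" type with a single field at run time. A real implementation would read the service's $metadata document and emit one type per entity set:

        using System;
        using System.Reflection;
        using System.Reflection.Emit;

        class EmitDemo
        {
            // Emits a public type named "Customer" with a public int field "ID".
            static Type EmitEntityType()
            {
                var asmName = new AssemblyName("DynamicEntities");
                var asm = AppDomain.CurrentDomain.DefineDynamicAssembly(
                    asmName, AssemblyBuilderAccess.Run);
                var module = asm.DefineDynamicModule("MainModule");
                var type = module.DefineType("Customer", TypeAttributes.Public);
                type.DefineField("ID", typeof(int), FieldAttributes.Public);
                return type.CreateType();
            }

            static void Main()
            {
                Type t = EmitEntityType();
                object customer = Activator.CreateInstance(t);
                t.GetField("ID").SetValue(customer, 22);
                Console.WriteLine(t.GetField("ID").GetValue(customer)); // prints 22
            }
        }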

  • Test data generators / quickest route to generating solid, non-repetitive, but not-real database sample data

    - by Jamo
    I need to build a quick feasibility test / proof-of-concept of a remote database for a client that will be populated with mostly-typical Company and People data (names, addresses, etc.); 150K records or so. The sample databases mentioned here were helpful: http://stackoverflow.com/questions/57068/good-databases-with-sample-data ...but I'd like to be able to generate sample data like this easily on less-typical datasets as well. Anyone have any recommendations for off-the-shelf (or off-the-web) solutions?
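
    Not a recommendation for a particular product, just a minimal stdlib-only Python sketch of the roll-your-own route, recombining name parts so the 150K records are varied but obviously not real; the pools, field layout, and file name are all illustrative:

        import csv
        import random

        FIRST = ["Alice", "Brian", "Carmen", "Dmitri", "Elena", "Farid"]
        LAST = ["Nguyen", "Okafor", "Petrov", "Quinn", "Ramos", "Schmidt"]
        STREETS = ["Oak St", "Maple Ave", "Birch Rd", "Cedar Ln"]

        random.seed(42)  # fixed seed so the sample data is reproducible

        with open("people.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["id", "name", "address"])
            for i in range(150000):
                name = "%s %s" % (random.choice(FIRST), random.choice(LAST))
                addr = "%d %s" % (random.randint(1, 9999), random.choice(STREETS))
                writer.writerow([i, name, addr])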

  • Resources related to data-mining and gaming on social networks

    - by darren
    Hi all. I'm interested in the problem of pattern mining among players of social networking games, for example detecting cheaters in a game given a company's user database. So far I have been following the usual recipe for a data mining project:

    - construct a data warehouse that aggregates significant information
    - select a classifier, and train it with a subsection of records from the warehouse
    - validate the classifier with another test set
    - lather, rinse, repeat

    Surprisingly, I've found very little in this area in terms of literature, best practices, etc. I am hoping to crowdsource the information-gathering problem here. Specifically, what I'm looking for:

    - What classifiers have worked well for this type of pattern mining? It seems highly temporal: users playing games, users receiving rewards, users transferring prizes, etc.
    - Are there any highly agreed-upon attributes specific to social networking / gaming data?
    - What is a practical amount of information that should be considered? One problem I've run into is data overload, where queries and data cleansing may take days to complete.
    - Related to the point above, what hardware resources are required to produce results? I've found it difficult to estimate the amount of computing power I will require for production use. It has become apparent that a white box in the corner does not have enough horsepower for such a project. Are companies generally resorting to cloud solutions? Are they buying clusters?

    Basically, any resources (theoretical, academic, or practical) about implementing a social networking / gaming pattern-mining program would be very much appreciated. Thanks.

  • improve my code for collapsing a list of data.frames

    - by romunov
    Dear StackOverflowers (flowers for short), I have a list of data.frames (walk.sample) that I would like to collapse into a single (giant) data.frame. While collapsing, I would like to mark (by adding another column) which rows came from which element of the list. This is what I've got so far. This is the list that needs to be collapsed/stacked:

        > walk.sample
        [[1]]
             walker        x         y
        1073      3 228.8756 -726.9198
        1086      3 226.7393 -722.5561
        1081      3 219.8005 -728.3990
        1089      3 225.2239 -727.7422
        1032      3 233.1753 -731.5526

        [[2]]
             walker        x         y
        1008      3 205.9104 -775.7488
        1022      3 208.3638 -723.8616
        1072      3 233.8807 -718.0974
        1064      3 217.0028 -689.7917
        1026      3 234.1824 -723.7423

        [[3]]
        [1] 3

        [[4]]
             walker        x         y
        546       2 629.9041  831.0852
        524       2 627.8698  873.3774
        578       2 572.3312  838.7587
        513       2 633.0598  871.7559
        538       2 636.3088  836.6325
        1079      3 206.3683 -729.6257
        1095      3 239.9884 -748.2637
        1005      3 197.2960 -780.4704
        1045      3 245.1900 -694.3566
        1026      3 234.1824 -723.7423

    I have written a function that adds a column denoting which element the rows came from, and then appends them to an existing data.frame:

        collapseToDataFrame <- function(x) { # collapse list to a data.frame with a twist
          walk.df <- data.frame()
          for (i in 1:length(x)) {
            n.rows <- nrow(x[[i]])
            if (length(x[[i]]) > 1) {
              temp.df <- cbind(x[[i]], rep(i, n.rows))
              names(temp.df) <- c("walker", "x", "y", "session")
              walk.df <- rbind(walk.df, temp.df)
            } else {
              cat("Empty list", "\n")
            }
          }
          return(walk.df)
        }

        > collapseToDataFrame(walk.sample)
        Empty list
        Empty list
             walker         x          y session
        3         1 -604.5055 -123.18759       1
        60        1 -562.0078  -61.24912       1
        84        1 -594.4661  -57.20730       1
        9         1 -604.2893 -110.09168       1
        43        1 -632.2491  -54.52548       1
        1028      3  240.3905 -724.67284       1
        1040      3  232.5545 -681.61225       1
        1073      3  228.8756 -726.91980       1
        1091      3  209.0373 -740.96173       1
        1036      3  248.7123 -694.47380       1

    I'm curious whether this can be done more elegantly, with perhaps do.call() or some other more generic function?
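
    Not necessarily the answer the asker settled on, just a sketch of the do.call() route: keep only the data.frame elements, tag each with its list index, then stack everything in one call. Column names are taken from the inputs, so the hard-coded names(temp.df) step goes away:

        collapseToDataFrame2 <- function(x) {
          keep <- sapply(x, is.data.frame)                 # drop non-data.frame elements
          tagged <- Map(function(df, i) cbind(df, session = i),
                        x[keep], seq_along(x)[keep])       # tag rows with their list index
          do.call(rbind, tagged)                           # stack into one data.frame
        }

        walk.df <- collapseToDataFrame2(walk.sample)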

  • VBA-Sorting the data in a listbox, sort works but data in listbox not changed

    - by Mike Clemens
    A listbox is passed, the data is placed in an array, the array is sorted, and then the data is placed back in the listbox. The part that does not work is putting the data back in the listbox. It's like the listbox is being passed by value instead of by ref. Here's the sub that does the sort and the line of code that calls the sort sub:

        Private Sub SortListBox(ByRef LB As MSForms.ListBox)
            Dim First As Integer
            Dim Last As Integer
            Dim NumItems As Integer
            Dim i As Integer
            Dim j As Integer
            Dim Temp As String
            Dim TempArray() As Variant

            ReDim TempArray(LB.ListCount)
            First = LBound(TempArray)        ' this works correctly
            Last = UBound(TempArray) - 1     ' this works correctly
            For i = First To Last
                TempArray(i) = LB.List(i)    ' this works correctly
            Next i
            For i = First To Last
                For j = i + 1 To Last
                    If TempArray(i) > TempArray(j) Then
                        Temp = TempArray(j)
                        TempArray(j) = TempArray(i)
                        TempArray(i) = Temp
                    End If
                Next j
            Next i                           ' data is now sorted
            LB.Clear                         ' this doesn't clear the items in the listbox
            For i = First To Last
                LB.AddItem TempArray(i)      ' this doesn't work either
            Next i
        End Sub

        Private Sub InitializeForm()
            ' There's code here to put data in the list box
            Call SortListBox(FieldSelect.CompleteList)
        End Sub

    Thanks for your help.

  • Problem with Core Data migration mapping model

    - by dpratt
    I have an iPhone app that uses Core Data for storage. I have successfully deployed it, and now I'm working on the second version. I've run into a problem with the data model that will require a few very simple data transformations at the time that the persistent store gets upgraded, so I can't just use the default inferred mapping model. My object model is stored in an .xcdatamodeld bundle, with versions 1.0 and 1.1 next to each other. Version 1.1 is set as the active version. Everything works fine when I use the default migration behavior and set NSInferMappingModelAutomaticallyOption to YES. My sqlite storage gets upgraded from the 1.0 version of the model, and everything is good except for, of course, the few transformations I need done. As an additional experimental step, I added a new mapping model to the Core Data model bundle, and have made no changes to what Xcode generated. When I run my app (with an older version of the data store), I get the following:

        *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Object's persistent store is not reachable from this NSManagedObjectContext's coordinator'

    What am I doing wrong? Here's my code to get the managed object model and the persistent store coordinator:

        - (NSPersistentStoreCoordinator *)persistentStoreCoordinator {
            if (_persistentStoreCoordinator != nil) {
                return _persistentStoreCoordinator;
            }
            _persistentStoreCoordinator = [[NSPersistentStoreCoordinator alloc]
                initWithManagedObjectModel:[self managedObjectModel]];
            NSURL *storeUrl = [NSURL fileURLWithPath:[[self applicationDocumentsDirectory]
                stringByAppendingPathComponent:@"gti_store.sqlite"]];
            NSError *error;
            NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithBool:YES], NSMigratePersistentStoresAutomaticallyOption,
                [NSNumber numberWithBool:YES], NSInferMappingModelAutomaticallyOption, nil];
            if (![_persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType
                    configuration:nil URL:storeUrl options:options error:&error]) {
                NSLog(@"Eror creating persistent store coodinator - %@",
                    [error localizedDescription]);
            }
            return _persistentStoreCoordinator;
        }

        - (NSManagedObjectModel *)managedObjectModel {
            if (_managedObjectModel == nil) {
                _managedObjectModel = [[NSManagedObjectModel mergedModelFromBundles:nil] retain];
                NSDictionary *entities = [_managedObjectModel entitiesByName];
                // add a sort descriptor to the 'Foo' fetched property so that it can have
                // an ordering - you can't add these from the graphical Core Data modeler
                NSEntityDescription *entity = [entities objectForKey:@"Foo"];
                NSFetchedPropertyDescription *fetchedProp =
                    [[entity propertiesByName] objectForKey:@"orderedBar"];
                NSSortDescriptor *sortDescriptor = [[[NSSortDescriptor alloc]
                    initWithKey:@"index" ascending:YES] autorelease];
                NSArray *sortDescriptors = [NSArray arrayWithObjects:sortDescriptor, nil];
                [[fetchedProp fetchRequest] setSortDescriptors:sortDescriptors];
            }
            return _managedObjectModel;
        }

  • Generic Data Structure Description Language

    - by Jon Purdy
    I am wondering whether there exists any declarative language for arbitrarily describing the format and semantics of a data structure, one that can be compiled to a specific implementation of that structure in any of a set of target languages. That is, something like a generic data definition language, but geared toward describing arbitrary data structures such as vectors, lists, trees, etc., and the semantics of operations on those structures. I ask because I had an idea for a feasible implementation of this concept, and I'm wondering whether it's worth it and, consequently, whether it's been done before. Another, slightly more abstract question: is there any real difference between the normative specification of a data structure (what it does) and its implementation (how it does it)?

  • How does COBOL store and retrieve data?

    - by controlfreak123
    I'm starting to learn about COBOL. I have some experience writing programs that deal with SQL databases, and I guess I'm confused about how COBOL stores and retrieves data that is stored on a mainframe, for example. I know that it's not like relational databases, but every example program I've seen takes data straight from the command line, and I know that's not how real-world COBOL programs process data. Can someone explain, or show me a good resource that can explain it?
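
    Not a full answer, but the short version is that classic COBOL programs read and write fixed-layout records in files (on the mainframe, typically sequential or indexed VSAM datasets) rather than issuing queries. Below is a minimal sketch of indexed-file access; the program, dataset, and field names are illustrative:

               IDENTIFICATION DIVISION.
               PROGRAM-ID. READCUST.
               ENVIRONMENT DIVISION.
               INPUT-OUTPUT SECTION.
               FILE-CONTROL.
                   SELECT CUSTOMER-FILE ASSIGN TO CUSTMAST
                       ORGANIZATION IS INDEXED
                       ACCESS MODE IS RANDOM
                       RECORD KEY IS CUST-ID.
               DATA DIVISION.
               FILE SECTION.
               FD  CUSTOMER-FILE.
               01  CUSTOMER-RECORD.
                   05  CUST-ID      PIC 9(6).
                   05  CUST-NAME    PIC X(30).
                   05  CUST-BALANCE PIC 9(7)V99.
               PROCEDURE DIVISION.
                   OPEN INPUT CUSTOMER-FILE
                   MOVE 123456 TO CUST-ID
                   READ CUSTOMER-FILE
                       INVALID KEY DISPLAY 'CUSTOMER NOT FOUND'
                       NOT INVALID KEY DISPLAY CUST-NAME
                   END-READ
                   CLOSE CUSTOMER-FILE
                   STOP RUN.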

  • Visibility of Class field-data of Mouse Clicked ImageButton located within WrapPanel

    - by Bill
    I am attempting to obtain the class data behind an ImageButton that is mouse-clicked, where the ImageButton is located within a WrapPanel filled with ImageButtons. The problem I am having is obtaining visibility of the field data within the class behind the image button. Although I can see the class, I can neither see nor access the field data. Can anyone please point me in the right direction?

        // Handles the ImageButton mouseClick event within the WrapPanel.
        private void SolarSystem_Click(Object sender, RoutedEventArgs e)
        {
            FrameworkElement fe = e.OriginalSource as FrameworkElement;
            SelectedPlanet PlanetSelected = new SelectedPlanet(fe);
            PlanetSelected.Owner = this;
            MessageBox.Show(PlanetSelected.PlanetName);
        }

        // Used to initiate instance of Class and some field data.
        public SelectedPlanet(FrameworkElement fe)
        {
            InitializeComponent();
            string sPlanetName = ((PlanetClass)(fe)).PlanetName;
            return sPlanetName
        }

        // Class Data
        public class PlanetClass
        {
            string planetName;

            public PlanetClass(string planetName)
            {
                PlanetName = planetName;
            }

            public string PlanetName
            {
                set { planetName = value; }
                get { return planetName; }
            }
        }

  • Core Data vs. SQLitePersistentObjects

    - by Macatomy
    I'm creating an iPhone app and I'm trying to choose between two solutions for a persistent store: Core Data, or SQLitePersistentObjects. Basically, all my app needs is a way to store an array of model objects and then load them again to display in a UITableView. It's nothing too complicated. Core Data seems to have a much higher learning curve than the simple-to-use SQLitePersistentObjects. Are there any obvious benefits to using Core Data over SQLitePersistentObjects in my case?

  • Build an ATOM feed reader for an ADO.NET Data Services feed

    - by khalil
    Hi, I have built an ADO.NET Data Service to expose data in a SQL Server database as XML. What I want to be able to do is create a feed reader for this ATOM feed in .NET, or maybe a user control which subscribes to this URI-based ATOM feed from the ADO.NET Data Service and publishes the latest information on our website.
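
    One possible starting point, not necessarily the best one: since an ADO.NET Data Services response is a standard ATOM document, the SyndicationFeed class can parse it directly (reference System.ServiceModel.Web on .NET 3.5, or System.ServiceModel on 4.0). The service URL below is a placeholder:

        using System;
        using System.ServiceModel.Syndication;
        using System.Xml;

        class FeedReaderDemo
        {
            static void Main()
            {
                // Hypothetical endpoint; substitute the real .svc entity-set URI.
                string url = "http://example.com/MyService.svc/Customers";
                using (XmlReader reader = XmlReader.Create(url))
                {
                    SyndicationFeed feed = SyndicationFeed.Load(reader);
                    foreach (SyndicationItem item in feed.Items)
                    {
                        Console.WriteLine("{0} ({1})",
                            item.Title.Text, item.LastUpdatedTime);
                    }
                }
            }
        }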

  • How to return plain XML from ADO.NET data service

    - by KHALIL
    Hi, I was wondering how to return plain XML from ADO.NET Data Services. I have exposed an ADO.NET Data Service to different departments in our company who are not so technical. The data returned is an ATOM feed, which is kind of hard to read/interpret with its format; too much information is returned. People from various departments would execute different queries (HTTP requests), and I wanted them to see simple XML, or at least something more user-friendly like HTML. I have tried setting the ACCEPT header of the request to plain XML, and it still returns ATOM. Thanks -- Khalil

  • Data Collection (Offline - no internet) and then syncing it to generate reports from server

    - by Nishant
    So, I have a new project I am planning on taking, and I need to know what skills will be required to achieve it. The project is to do intensive data collection in the field, where they don't have internet access. As part of the data collection, images will be uploaded, which will have to be resized, etc. Once the data collection occurs, this data needs to be consolidated and reported on. I am thinking there are two ways of generating the report:

    1. Into a PDF that can be designed.
    2. Is there a way to generate an executable file (since the PDF will be huge due to multiple images, etc.) where the executable file is navigation-friendly, with drop-downs etc.? It might not be an executable file, but could be a web page or some other way that this can be delivered to the client in a friendly, professional way.

    The PDF will have to be generated somehow so that it can be printed as a hard copy. What languages and skill sets will I need to accomplish this project?

  • Can Core Data be used on Linux?

    - by glenc
    This might be a stupid question, but I was wondering whether or not you can use the Core Data libraries on Linux at all? I'm planning how to build the server side of an iPhone app that I'm working on, and have found that you can use PyObjC to get access to Core Data in a Python environment, e.g. use Core Data in a TurboGears web application. At this point I'm thinking that you would have to run the web server on Mac OS X, because I can't find any evidence on the internet that you can access the Objective-C libraries on Linux. I've always written webapps on Linux, but I will obviously make the jump to an OS X server if it allows me to use the same datastore implementation on the iPhone and the server, the only job remaining being the Core Data <-> Web Services XML translation that has to happen on the wire.

  • Parse and transform XML with missing elements into table structure

    - by dnlbrky
    I'm trying to parse an XML file. A simplified version of it looks like this:

        x <- '<grandparent><parent><child1>ABC123</child1><child2>1381956044</child2></parent><parent><child2>1397527137</child2></parent><parent><child3>4675</child3></parent><parent><child1>DEF456</child1><child3>3735</child3></parent><parent><child1/><child3>3735</child3></parent></grandparent>'

        library(XML)
        xmlRoot(xmlTreeParse(x))
        ## <grandparent>
        ##  <parent>
        ##   <child1>ABC123</child1>
        ##   <child2>1381956044</child2>
        ##  </parent>
        ##  <parent>
        ##   <child2>1397527137</child2>
        ##  </parent>
        ##  <parent>
        ##   <child3>4675</child3>
        ##  </parent>
        ##  <parent>
        ##   <child1>DEF456</child1>
        ##   <child3>3735</child3>
        ##  </parent>
        ##  <parent>
        ##   <child1/>
        ##   <child3>3735</child3>
        ##  </parent>
        ## </grandparent>

    I'd like to transform the XML into a data.frame / data.table that looks like this:

        parent <- data.frame(child1=c("ABC123",NA,NA,"DEF456",NA),
                             child2=c(1381956044, 1397527137, rep(NA, 3)),
                             child3=c(rep(NA, 2), 4675, 3735, 3735))
        parent
        ##   child1     child2 child3
        ## 1 ABC123 1381956044     NA
        ## 2   <NA> 1397527137     NA
        ## 3   <NA>         NA   4675
        ## 4 DEF456         NA   3735
        ## 5   <NA>         NA   3735

    If each parent node always contained all of the possible elements ("child1", "child2", "child3", etc.), I could use xmlToList and unlist to flatten it, and then dcast to put it into a table. But the XML often has missing child elements. Here is an attempt with incorrect output:

        library(data.table)

        ## Flatten:
        dt <- as.data.table(unlist(xmlToList(x)), keep.rownames=T)
        setnames(dt, c("column", "value"))

        ## Add row numbers, but they're incorrect due to missing XML elements:
        dt[, row:=.SD[,.I], by=column][]
        ##           column      value row
        ## 1: parent.child1     ABC123   1
        ## 2: parent.child2 1381956044   1
        ## 3: parent.child2 1397527137   2
        ## 4: parent.child3       4675   1
        ## 5: parent.child1     DEF456   2
        ## 6: parent.child3       3735   2
        ## 7: parent.child3       3735   3

        ## Reshape from long to wide, but some values are in the wrong row:
        dcast.data.table(dt, row~column, value.var="value", fill=NA)
        ##    row parent.child1 parent.child2 parent.child3
        ## 1:   1        ABC123    1381956044          4675
        ## 2:   2        DEF456    1397527137          3735
        ## 3:   3            NA            NA          3735

    I won't know ahead of time the names of the child elements, or the count of unique element names for children of the grandparent, so the answer should be flexible.
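
    One flexible approach (a sketch, not the canonical answer): pull the children of each <parent> node, pad each row out to the union of all child names with NA, and then stack. Only base R plus the XML package:

        library(XML)

        doc <- xmlParse(x)
        parents <- getNodeSet(doc, "//parent")

        # One named character vector per <parent>; empty elements such as
        # <child1/> have an empty xmlValue and are recoded as NA.
        rows <- lapply(parents, function(p) {
          vals <- sapply(xmlChildren(p), xmlValue)
          vals[vals == ""] <- NA
          vals
        })

        cols <- sort(unique(unlist(lapply(rows, names))))

        # Pad each row to the full column set, then stack.
        parent <- do.call(rbind, lapply(rows, function(r) {
          out <- setNames(rep(NA_character_, length(cols)), cols)
          out[names(r)] <- r
          as.data.frame(as.list(out), stringsAsFactors = FALSE)
        }))

        parent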

  • Selecting element by data attribute

    - by zizo
    Is there an easy and straightforward method to select elements based on their data attribute? For example, select all anchors that have a data attribute named customerID with a value of 22. I am kind of hesitant to use rel or other attributes to store such information, but I find it much harder to select an element based on what data is stored in it. Thanks!
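
    Assuming jQuery (the question reads like a jQuery one), the attribute-equals selector handles this directly. A small sketch with an illustrative class name; note the attribute is written in lowercase, since HTML lowercases attribute names in the DOM:

        // All anchors whose data-customerid attribute equals "22".
        var $links = $('a[data-customerid="22"]');

        // Equivalent filter form, handy when the value is computed at run time.
        var id = 22;
        var $links2 = $('a').filter(function () {
            return $(this).attr('data-customerid') === String(id);
        });

        $links.addClass('highlight'); // do something with the selection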

  • R: convert data.frame columns from factors to characters

    - by Mike Dewar
    Hi, I have a data frame. Let's call him bob:

        > head(bob)
                                                            phenotype exclusion
        GSM399350 3- 4- 8- 25- 44+ 11b- 11c- 19- NK1.1- Gr1- TER119-
        GSM399351 3- 4- 8- 25- 44+ 11b- 11c- 19- NK1.1- Gr1- TER119-
        GSM399352 3- 4- 8- 25- 44+ 11b- 11c- 19- NK1.1- Gr1- TER119-
        GSM399353 3- 4- 8- 25+ 44+ 11b- 11c- 19- NK1.1- Gr1- TER119-
        GSM399354 3- 4- 8- 25+ 44+ 11b- 11c- 19- NK1.1- Gr1- TER119-
        GSM399355 3- 4- 8- 25+ 44+ 11b- 11c- 19- NK1.1- Gr1- TER119-

    I'd like to concatenate the rows of this data frame (this will be another question). But look:

        > class(bob$phenotype)
        [1] "factor"

    Bob's columns are factors. So, for example:

        > as.character(head(bob))
        [1] "c(3, 3, 3, 6, 6, 6)"       "c(3, 3, 3, 3, 3, 3)"
        [3] "c(29, 29, 29, 30, 30, 30)"

    I don't begin to understand this, but I guess these are indices into the levels of the factors of the columns (of the court of King Caractacus) of bob? Not what I need. Strangely, I can go through the columns of bob by hand and do

        bob$phenotype <- as.character(bob$phenotype)

    which works fine. And, after some typing, I can get a data.frame whose columns are characters rather than factors. So my question is: how can I do this automatically? How do I convert a data.frame with factor columns into a data.frame with character columns without having to manually go through each column? Bonus question: why does the manual approach work?
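
    A sketch of the usual one-liner for this, not the only way: lapply over the columns and reassign with bob[] so the data.frame structure is kept:

        # Convert every factor column to character in one pass; other columns untouched.
        bob[] <- lapply(bob, function(col) {
          if (is.factor(col)) as.character(col) else col
        })

        sapply(bob, class)  # check the result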

  • Shipping Java code with data baked into the .jar

    - by Andrew
    I need to ship some Java code that has an associated set of data. It's a simulator for a device, and I want to be able to include all of the data used for the simulated records in the one .jar file. In this case, each simulated record contains four fields (calling party, called party, start of call, call duration). What's the best way to do that? I've gone down the path of generating the data as Java statements, but IntelliJ doesn't seem particularly happy dealing with a 100,000-line Java source file! Is there a smarter way to do this? In the C#/.NET world I'd create the data as a separate file, embed it in the assembly as a resource, and then use reflection to pull it out at runtime and access it. I'm unsure of what the appropriate analogy is in the Java world. FWIW, Java 1.6, shipping for Solaris.
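
    The closest Java analogy is a classpath resource: put the data file inside the .jar next to the classes and read it back with Class.getResourceAsStream(). A minimal sketch, where the file name and CSV layout are illustrative:

        import java.io.BufferedReader;
        import java.io.InputStream;
        import java.io.InputStreamReader;

        public class SimulatedRecords {
            public static void main(String[] args) throws Exception {
                // records.csv is packaged at the root of the .jar.
                InputStream in =
                    SimulatedRecords.class.getResourceAsStream("/records.csv");
                BufferedReader reader =
                    new BufferedReader(new InputStreamReader(in, "UTF-8"));
                String line;
                while ((line = reader.readLine()) != null) {
                    // fields: calling party, called party, start of call, duration
                    String[] fields = line.split(",");
                    System.out.println(fields[0] + " -> " + fields[1]);
                }
                reader.close();
            }
        }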

  • Get control instance in ASP.NET Dynamic Data

    - by Ashwani K
    Hello all: I am creating a web application using ASP.NET Dynamic Data. I am using a GridView to show data from the database. In the grid view I have the following code for columns:

        <Columns>
            <asp:DynamicField DataField="UserId" UIHint="Label" />
            <asp:DynamicField DataField="Address" UIHint="Address"/>
            <asp:DynamicField DataField="CreatedDate" UIHint="Label" />
        </Columns>

    But before displaying, I want to do some processing in C# code for each row. In a normal ASP.NET grid view we can handle the OnRowDataBound method, and using FindControl("controlid") we can get the control instance; but in the case of Dynamic Data I am not getting any id attribute for columns, so I am not able to get the control instance to show updated data in that control depending on some conditions. Thanks, Ashwani

  • Neo4j Reading data / performing shortest path calculations on stored data

    - by paddydub
    I'm using the Batch_Insert example to insert data into the database. How can I read this data back from the database? I can't find any examples of how to do this.

        public static void CreateData()
        {
            // create the batch inserter
            BatchInserter inserter = new BatchInserterImpl( "var/graphdb",
                    BatchInserterImpl.loadProperties( "var/neo4j.props" ) );
            Map<String,Object> properties = new HashMap<String,Object>();
            properties.put( "name", "Mr. Andersson" );
            properties.put( "age", 29 );
            long node1 = inserter.createNode( properties );
            properties.put( "name", "Trinity" );
            properties.remove( "age" );
            long node2 = inserter.createNode( properties );
            inserter.createRelationship( node1, node2,
                    DynamicRelationshipType.withName( "KNOWS" ), null );
            inserter.shutdown();
        }

    I would like to store graph data in the database:

        graph.makeEdge( "s", "c", "cost", (double) 7 );
        graph.makeEdge( "c", "e", "cost", (double) 7 );
        graph.makeEdge( "s", "a", "cost", (double) 2 );
        graph.makeEdge( "a", "b", "cost", (double) 7 );
        graph.makeEdge( "b", "e", "cost", (double) 2 );
        Dijkstra<Double> dijkstra = getDijkstra( graph, 0.0, "s", "e" );

    What is the best method to store this kind of data with tens of thousands of edges, and then run the Dijkstra algorithm to find shortest paths using the stored graph data?
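
    A sketch of reading the store back with the normal embedded API (Neo4j 1.x era, matching the BatchInserter code above); the node id is whatever inserter.createNode(...) returned, so treat the literal 1 below as illustrative:

        import org.neo4j.graphdb.Direction;
        import org.neo4j.graphdb.GraphDatabaseService;
        import org.neo4j.graphdb.Node;
        import org.neo4j.graphdb.Relationship;
        import org.neo4j.kernel.EmbeddedGraphDatabase;

        public class ReadBack {
            public static void main(String[] args) {
                // Open the same store directory the BatchInserter wrote;
                // the inserter must have been shut down first.
                GraphDatabaseService db = new EmbeddedGraphDatabase("var/graphdb");
                try {
                    Node node = db.getNodeById(1); // id from inserter.createNode(...)
                    System.out.println(node.getProperty("name"));
                    for (Relationship rel :
                            node.getRelationships(Direction.OUTGOING)) {
                        System.out.println(rel.getType().name() + " -> "
                                + rel.getEndNode().getProperty("name"));
                    }
                } finally {
                    db.shutdown();
                }
            }
        }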

  • Oracle Data Pump import to a SQL file error: ORA-31655 no data or metadata objects

    - by Francisco Quiñones
    Hello, I'm using Data Pump to export/import data; one requirement is to import the data to a SQL file. The OS is Windows. I made the following export:

        expdp system/password directory=dpump_dir dumpfile=tablesdump.dmp content=DATA_ONLY tables=user.tablename

    and it works; I can see the file TABLESDUMP.DMP in the directory path. Then, when I tried to import it to a SQL file:

        impdp system/password directory=dpump_dir dumpfile=tablesdump.dmp sqlfile=tables_export.sql

    the log shows:

        .....
        ORA-31655 no data or metadata objects selected for job
        .....

    and the SQL file is created empty in the directory path. I'm not a DBA, I'm a Java developer. Can you help me? Thks
