Search Results

Search found 64472 results on 2579 pages for 'data context'.


  • Plotting results of hierarchical clustering on top of a matrix of data in Python

    - by user248237
    How can I plot a dendrogram right on top of a matrix of values, reordered appropriately to reflect the clustering, in Python? An example is at the bottom of the following figure: http://www.coriell.org/images/microarray.gif I use scipy.cluster.dendrogram to make my dendrogram and perform hierarchical clustering on a matrix of data. How can I then plot the data as a matrix where the rows have been reordered to reflect a clustering induced by cutting the dendrogram at a particular threshold, and have the dendrogram plotted alongside the matrix? I know how to plot the dendrogram in scipy, but not how to plot the intensity matrix of data with the right scale bar next to it. Any help on this would be greatly appreciated.
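
    A minimal sketch of one way to do this (not from the original post; it assumes matplotlib is available and uses random data): draw the dendrogram in one axes, reorder the matrix rows by the dendrogram's leaf order, and add a colorbar as the scale bar.

      import numpy as np
      import matplotlib.pyplot as plt
      import scipy.cluster.hierarchy as sch

      data = np.random.rand(20, 10)                 # rows will be clustered
      linkage = sch.linkage(data, method="average")

      fig = plt.figure(figsize=(8, 6))

      # Left axes: the dendrogram, drawn sideways so leaves line up with rows.
      ax_dendro = fig.add_axes([0.05, 0.1, 0.2, 0.8])
      dendro = sch.dendrogram(linkage, orientation="left", ax=ax_dendro)
      ax_dendro.set_xticks([])
      ax_dendro.set_yticks([])

      # Right axes: the matrix, rows reordered to match the dendrogram leaves
      # (depending on orientation you may need to reverse this order).
      order = dendro["leaves"]
      ax_heat = fig.add_axes([0.3, 0.1, 0.55, 0.8])
      im = ax_heat.matshow(data[order, :], aspect="auto", cmap="viridis")
      ax_heat.set_yticks([])

      # The "scale bar": a colorbar in its own narrow axes next to the matrix.
      ax_cbar = fig.add_axes([0.88, 0.1, 0.03, 0.8])
      plt.colorbar(im, cax=ax_cbar)
      plt.show()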

    Read the article

  • Data import wizard library for .Net?

    - by Phil
    Does anyone know of a 3rd party data import wizard that can be embedded into applications? It should import from Excel, Access, SQLServer, csv, tab-separated flat file, XML, Oracle etc. We have a fixed data structure within our application and the user should be able to configure the wizard to match his/her import fields to our own data structure. The wizard should be a library of sorts – preferably a .Net type library. We may want to have it both web-based and desktop based (hence we may need an ASP.Net controls version and a Winforms version). We may also want integration with WPF and Silverlight. If there’s no UI wizard available, does anyone know of a non-UI library that supports easily configurable import from many, many different datasources?

    Read the article

  • String Field Sizes for unicode database fields using different data access components

    - by Serg
    mjustin in his question 1 and question 2 says that the TWideStringField.Size property for UTF8 fields in Delphi 2009 dbExpress is 4 times larger than the logical field size (the maximum number of characters in the field). I am inclined to consider this a dbExpress bug. This is what the Delphi 2009 Help says: The interpretation of Size depends on the data type. The meaning of Size for data types that use it is given in the following table. For all other data types, Size is not used and its value is always 0. ftString - Size is the maximum number of characters in the string. I am using FibPlus 6.9.9 and it follows the above documentation - the string field size is the maximum number of characters, not bytes. So this also implies a follow-up question: are the dbExpress drivers in Delphi 2009 unusable for Unicode databases?

    Read the article

  • setIncludesSubentities: in an NSFetchRequest is broken for entities across multiple persistent stores

    - by SG
    Prior art which doesn't quite address this: http://stackoverflow.com/questions/1774359/core-data-migration-error-message-model-does-not-contain-configuration-xyz
    I have narrowed this down to a specific issue. It takes a minute to set up, though; please bear with me. The gist of the issue is that a persistentStoreCoordinator (apparently) cannot preserve the part of an object graph where a managedObject is marked as a subentity of another when they are stored in different files. Here goes...
    1) I have 2 xcdatamodel files, each containing a single entity. At runtime, when the managed object model is constructed, I manually define one entity as a subentity of another using setSubentities:. This is because defining subentities across multiple files in the editor is not supported yet. I then return the complete model with modelByMergingModels.
      //Works!
      [mainEntity setSubentities:canvasEntities];
      NSLog(@"confirm %@ is super for %@", [[[canvasEntities lastObject] superentity] name], [[canvasEntities lastObject] name]);
      //Output: "confirm Note is super for Browser"
    2) I have modified the persistentStoreCoordinator method so that it sets a different store for each entity. Technically, it uses configurations, and each entity has one and only one configuration defined.
      //Also works!
      for ( NSString *configName in [[HACanvasPluginManager shared].registeredCanvasTypes valueForKey:@"viewControllerClassName"] ) {
          storeUrl = [NSURL fileURLWithPath:[[self applicationDocumentsDirectory] stringByAppendingPathComponent:[configName stringByAppendingPathExtension:@"sqlite"]]];
          //NSLog(@"entities for configuration '%@': %@", configName, [[[self managedObjectModel] entitiesForConfiguration:configName] valueForKey:@"name"]);
          //Output: "entities for configuration 'HATextCanvasController': (Note)"
          //Output: "entities for configuration 'HAWebCanvasController': (Browser)"
          if (![persistentStoreCoordinator addPersistentStoreWithType:NSSQLiteStoreType configuration:configName URL:storeUrl options:options error:&error])
          //etc
    3) I have a fetchRequest set for the parent entity, with setIncludesSubentities: and setAffectedStores: just to be sure we get both 1) and 2) covered. When inserting objects of either entity, they both are added to the context and they both are fetched by the fetchedResultsController and displayed in the tableView as expected.
      // Create the fetch request for the entity.
      NSFetchRequest *fetchRequest = [[NSFetchRequest alloc] init];
      [fetchRequest setEntity:entity];
      [fetchRequest setIncludesSubentities:YES]; //NECESSARY to fetch all canvas types
      [fetchRequest setSortDescriptors:sortDescriptors];
      [fetchRequest setFetchBatchSize:20]; // Set the batch size to a suitable number.
      [fetchRequest setAffectedStores:[[managedObjectContext persistentStoreCoordinator] persistentStores]];
      [fetchRequest setReturnsObjectsAsFaults:NO];
    Here is where it starts misbehaving: after closing and relaunching the app, ONLY THE PARENT ENTITY is fetched. If I change the entity of the request using setEntity: to the entity for 'Note', all notes are fetched. If I change it to the entity for 'Browser', all the browsers are fetched. Let me reiterate that during the run in which an object is first inserted into the context, it will appear in the list. It is only after save and relaunch that a fetch request fails to traverse the hierarchy. Therefore, I can only conclude that it is the storage of the inheritance that is the problem.
    Let's recap why:
    - Both entities can be created, inserted into the context, and viewed, so the model is working.
    - Both entities can be fetched with a single request, so the inheritance is working.
    - I can confirm that the files are being stored separately and objects are going into their appropriate stores, so saving is working.
    - Launching the app with either entity set for the request works, so retrieval from the store is working.
    - This also means that traversing different stores with the request is working.
    - By using a single store instead of multiple, the problem goes away completely, so creating, storing, fetching, viewing etc. is working correctly.
    This leaves only one culprit (to my mind): the inheritance I'm setting with setSubentities: is effective only for objects created during the session. Either objects/entities are being stored stripped of the inheritance info, or entity inheritance as defined programmatically only applies to new instances, or both. Either of these is unacceptable. Either it's a bug or I am way, way off course. I have been at this every which way for two days; any insight is greatly appreciated. The current workaround - just using a single store - works completely, except it won't be future-proof in the event that I remove one of the models from the app, etc. It also boggles the mind, because I can't see why you would have all this infrastructure for storing across multiple stores and for setting affected stores in fetch requests if it by core definition (of setSubentities:) doesn't work.

    Read the article

  • Technology stack for very frequent gps data collection

    - by gvaswani
    I am working on a project that involves GPS data collection from many users (say 1000) every second (while they move). I am planning on using a dedicated database instance on EC2 with MySQL on persistent block storage, and running a Ruby on Rails application with an nginx frontend. I haven't worked on such a data collection application before. Am I missing something here? I will have another instance which will act as the application server and use the data from the same EBS. If anybody has dealt with such a system before, any advice would be much appreciated.

    Read the article

  • Java Google App Engine inconsistent data loss after restarting dev server

    - by user259349
    Hello everyone, I am using Java GAE. So far, I'm just scaffolding my data objects and I'm seeing an interesting issue. The records that I am playing around with are updated properly as long as my dev server is running. The second my dev server gets restarted, I lose all of my changes. That would not be so alarming if I lost all of my records, but there was a point in time where my data persisted through a server restart. I'm worried that I would lose production data if I launched without fixing this potential bug. Any idea on where I should look?

    Read the article

  • ..../All Users/Application data folder permissions

    - by Amit Kumar Jain
    I have a Windows desktop application whose application data is stored in the All Users/Application Data/My Company folder. I install my application on a Windows XP machine using an Administrator login. If I run my application using that administrator's login it works well, but when I try to run it using a normal user's login on that machine it fails. The reason for the failure is that the normal user is not able to write anything in the All Users/Application Data/My Company folder. Is some kind of permission required for the All Users folder on a Windows XP machine? If yes, where can I set that permission?
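
    A normal user typically cannot modify files created in that folder by another account (such as the installing administrator), so the usual fix is to grant the Users group write access once, during installation. A hedged sketch, assuming the application is .NET and the installer can run a custom step with admin rights:

      using System;
      using System.IO;
      using System.Security.AccessControl;
      using System.Security.Principal;

      static void GrantUsersModifyAccess()
      {
          // All Users\Application Data on XP, ProgramData on later versions.
          string dir = Path.Combine(
              Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
              "My Company");
          Directory.CreateDirectory(dir);

          // Give the built-in Users group Modify rights on the folder and its contents.
          DirectorySecurity security = Directory.GetAccessControl(dir);
          security.AddAccessRule(new FileSystemAccessRule(
              new SecurityIdentifier(WellKnownSidType.BuiltinUsersSid, null),
              FileSystemRights.Modify,
              InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
              PropagationFlags.None,
              AccessControlType.Allow));
          Directory.SetAccessControl(dir, security);
      }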

    Read the article

  • How can I force asp.net webapi to always decode POST data as JSON

    - by Nathan Reed
    I'm getting some JSON data posted to my ASP.NET Web API, but the POST parameter is always coming up null - the data is not being deserialized correctly. The method looks something like this:
      public HttpResponseMessage Post(string id, RegistrationData registerData)
    It seems the problem is that the client (which I have no control over) always sends the Content-Type as x-www-form-urlencoded, even though the content is actually JSON. This causes the framework to try to deserialize it as form data, which fails. Is there any way to get Web API to always deserialize as JSON and ignore the Content-Type header?
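
    A sketch of one commonly suggested workaround (not necessarily the only option): a DelegatingHandler that overwrites the incoming Content-Type before model binding runs, so the JSON formatter is used regardless of what the client claims.

      using System.Net.Http;
      using System.Net.Http.Headers;
      using System.Threading;
      using System.Threading.Tasks;

      public class ForceJsonHandler : DelegatingHandler
      {
          protected override Task<HttpResponseMessage> SendAsync(
              HttpRequestMessage request, CancellationToken cancellationToken)
          {
              if (request.Content != null)
              {
                  // The client says x-www-form-urlencoded but the body is really JSON.
                  request.Content.Headers.ContentType =
                      new MediaTypeHeaderValue("application/json");
              }
              return base.SendAsync(request, cancellationToken);
          }
      }

      // Registered once at startup, e.g. in WebApiConfig.Register:
      // config.MessageHandlers.Add(new ForceJsonHandler());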

    Read the article

  • Reading Binary data from a Serial Port.

    - by rross
    I have previously been reading NMEA data from a GPS via a serial port using C#. Now I'm doing something similar, but instead of GPS data I'm attempting to read a KISS statement from a TNC. I'm using this event handler:
      comport.DataReceived += new SerialDataReceivedEventHandler(port_DataReceived);
    Here is port_DataReceived:
      private void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
      {
          string data = comport.ReadExisting();
          sBuffer = data;
          try
          {
              this.Invoke(new EventHandler(delegate { ProcessBuffer(sBuffer); }));
          }
          catch { }
      }
    The problem I'm having is that the method is called several times per statement, so ProcessBuffer is called with only a partial statement. How can I read the whole statement?
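
    ReadExisting returns whatever bytes happen to have arrived so far, so partial frames are expected. A hedged sketch of one fix (it assumes the TNC brackets each KISS frame with the FEND byte 0xC0, and that ProcessBuffer is changed to accept a byte array): accumulate raw bytes and only hand off complete frames.

      private readonly List<byte> _rxBuffer = new List<byte>();
      private const byte FEND = 0xC0;   // KISS frame delimiter

      private void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
      {
          // Read bytes, not text - KISS is a binary protocol.
          int count = comport.BytesToRead;
          byte[] chunk = new byte[count];
          comport.Read(chunk, 0, count);
          _rxBuffer.AddRange(chunk);

          // Hand off every complete FEND ... FEND frame currently in the buffer.
          int start;
          while ((start = _rxBuffer.IndexOf(FEND)) >= 0)
          {
              int end = _rxBuffer.IndexOf(FEND, start + 1);
              if (end < 0)
                  break;   // the rest of the frame hasn't arrived yet
              byte[] frame = _rxBuffer.GetRange(start + 1, end - start - 1).ToArray();
              _rxBuffer.RemoveRange(0, end + 1);
              this.Invoke(new EventHandler(delegate { ProcessBuffer(frame); }));
          }
      }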

    Read the article

  • Looking for the most painless non-RDBMS storage method in C#

    - by NateD
    I'm writing a simple program that will run entirely client-side. (Desktop programming? do people still do that?) and I need a simple way to store trivial amounts of data in a structured form, but really don't see any need to use a database system. What's more, some of the data needs to be serialized and passed around to different users, like some kind of "file" or perhaps a "document". (has anyone ever done that before?) So, I've looked at using .Net DataSets, LINQ, direct XML manipulation, and they all seem like they would get the job done, but I would like to know before I dive into any of them if there's one method that is generally regarded as easier to code than others. As I said, the amount of data to be stored is trivial, even if one hundred people all used the same machine we're not talking about more than 10 MB, so performance is not as large a concern as is codeability/maintainability. Thank you all in advance!
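
    For the "serialize it and pass it around like a document" part, plain XmlSerializer over a simple object graph is probably the lowest-friction of the options listed (a sketch; the Document type here is made up):

      using System.Collections.Generic;
      using System.IO;
      using System.Xml.Serialization;

      public class Document
      {
          public string Owner { get; set; }
          public List<string> Entries { get; set; }
      }

      public static class DocumentStore
      {
          private static readonly XmlSerializer Serializer =
              new XmlSerializer(typeof(Document));

          public static void Save(Document doc, string path)
          {
              using (FileStream stream = File.Create(path))
                  Serializer.Serialize(stream, doc);
          }

          public static Document Load(string path)
          {
              using (FileStream stream = File.OpenRead(path))
                  return (Document)Serializer.Deserialize(stream);
          }
      }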

    Read the article

  • Implementing full text search on iPhone?

    - by Nimrod
    I'm looking for suggestions on the best way to implement a full-text search on some static data on the iPhone. Basically I have an app that contains the offline version of a web site, about 50MB of text, and I'd like for users to be able to search for terms. I figure that I should somehow build a table of ("word", reference_to_file_containing_word) or something, put that into either Core Data or just sqlite, index the "word" column, then have the search facility search the table for search terms and take the intersection of the sets of results for the terms or something. That wouldn't allow people to search for phrases but it would be pretty easy and probably not too slow. I'd like to just use existing SDK features for this. Should I use Core Data or sqlite? Does anyone have any other ideas on how this could be done?
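
    If the SQLite build on the device includes the FTS3 extension (worth checking; it is not guaranteed on every SDK version), the hand-rolled word/file table can be replaced with a full-text virtual table, which also gives phrase queries for free. An illustrative sketch:

      -- One row per page of the offline site; body holds the searchable text.
      CREATE VIRTUAL TABLE pages USING fts3(file_path, body);

      INSERT INTO pages (file_path, body)
      VALUES ('docs/intro.html', 'full text of the page ...');

      -- MATCH handles tokenizing, multiple terms, and quoted phrases.
      SELECT file_path FROM pages WHERE body MATCH 'search terms';
      SELECT file_path FROM pages WHERE body MATCH '"exact phrase"';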

    Read the article

  • How to sync SQLite data to a server on the iPhone

    - by crawler486
    Hello guys, I'm a total noob when it comes to iPhone development and have been tasked to create a database app that can download and upload SQLite data to and from a server via HTTP using web services. So far I have a form for retrieving and saving data to a SQLite database, and now I need some information on how I can upload the SQLite data to a server. The SQLite database will only have one table with 3 columns and about 200 rows max. I hope somebody can point me in the right direction or lead me to some sample code. I appreciate any help.
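
    A very rough sketch of the upload side (the URL, field names, and row variables are all made up, and in a real app this should run off the main thread): read rows out of SQLite as you already do, then POST them with NSURLConnection.

      NSString *body = [NSString stringWithFormat:@"id=%d&name=%@&value=%@",
                        rowId, name, value];

      NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
          [NSURL URLWithString:@"http://example.com/api/upload"]];
      [request setHTTPMethod:@"POST"];
      [request setValue:@"application/x-www-form-urlencoded"
          forHTTPHeaderField:@"Content-Type"];
      [request setHTTPBody:[body dataUsingEncoding:NSUTF8StringEncoding]];

      NSURLResponse *response = nil;
      NSError *error = nil;
      NSData *result = [NSURLConnection sendSynchronousRequest:request
                                             returningResponse:&response
                                                         error:&error];
      // Inspect result/error and mark the row as synced on success.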

    Read the article

  • DataGridView lags for a second with large data updates

    - by alexD
    I have a DataGridView with about 400 rows and 10 columns. When the user first displays this table, it receives all of the data from the server and populates the table. The DGV uses a DataTable as its data source, and when updating the DataTable I use row.BeginEdit/EndEdit and AcceptChanges, but when the view itself is updated it lags for a second while the whole DGV is refreshed. I am wondering if there is a way to make this smooth, so that, for example, if the user is scrolling through the data and it updates, it won't interrupt the scrolling. Or if the user is moving the display around the screen and it updates, it won't interrupt. Is there an easy way to do this? If not, is there a way to prevent the DGV from updating the view until all events have ended, so it won't be repainted until the user stops scrolling, dragging, etc.?
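
    Two things that usually help, sketched below (RowUpdate, the column names, and the assumption that the table has a primary key are all illustrative): batch the edits between BeginLoadData/EndLoadData so the bound grid gets one reset instead of hundreds of change events, and turn on the grid's double buffering.

      void ApplyServerUpdates(DataTable table, IEnumerable<RowUpdate> updates)
      {
          table.BeginLoadData();               // suspend change notifications/indexing
          foreach (RowUpdate u in updates)
          {
              DataRow row = table.Rows.Find(u.Key);   // requires a PrimaryKey on the table
              if (row != null)
              {
                  row.BeginEdit();
                  row["Value"] = u.Value;
                  row.EndEdit();
              }
          }
          table.EndLoadData();                 // one list-changed event at the end
          table.AcceptChanges();
      }

      // DataGridView is not double-buffered by default; enabling it cuts repaint cost.
      typeof(DataGridView)
          .GetProperty("DoubleBuffered", BindingFlags.Instance | BindingFlags.NonPublic)
          .SetValue(dataGridView1, true, null);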

    Read the article

  • What language is to binary, as Perl is to text?

    - by ehdr
    I am looking for a scripting (or higher level programming) language (or e.g. modules for Python or similar languages) for effortlessly analyzing and manipulating binary data in files (e.g. core dumps), much like Perl allows manipulating text files very smoothly. Things I want to do include presenting arbitrary chunks of the data in various forms (binary, decimal, hex), convert data from one endianess to another, etc. That is, things you normally would use C or assembly for, but I'm looking for a language which allows for writing tiny pieces of code for highly specific, one-time purposes very quickly. Any suggestions?
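
    For what it's worth, Python plus the struct module covers most of this (a sketch; the file name and field layout are made up): slicing gives arbitrary chunks, and the format strings handle the endianness conversion.

      import struct

      with open("core.dump", "rb") as f:
          data = f.read()

      # Interpret bytes 16..24 as a big-endian u32 followed by two u16s.
      magic, flags, count = struct.unpack(">IHH", data[16:24])
      print(hex(magic), flags, count)

      # Re-pack the same fields little-endian and splice them back in.
      patched = data[:16] + struct.pack("<IHH", magic, flags, count) + data[24:]
      print(patched[16:24].hex())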

    Read the article

  • Print ms access data in vb.net

    - by user225269
    How do I print MS Access data (.mdb) in VB.NET? Here is the code that I'm using to view the data in the form. What I want is to be able to print what is currently being viewed - perhaps automatically save a .pdf file so that the PDF viewer installed on the system will open the newly generated file.
      Dim cn As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\search.mdb")
      Dim cmd As OleDbCommand = New OleDbCommand("Select * from GH where NAME= '" & TextBox6.Text & "' ", cn)
      cn.Open()
      Dim rdr As OleDbDataReader
      rdr = cmd.ExecuteReader
      If rdr.HasRows Then
          rdr.Read()
          NoAcc = rdr("NAME")
          If (TextBox6.Text = NoAcc) Then TextBox1.Text = rdr("IDNUMBER")
          If (TextBox6.Text = NoAcc) Then TextBox7.Text = rdr("DEPARTMENT")
          If (TextBox6.Text = NoAcc) Then TextBox8.Text = rdr("COURSE")
      End If
    Some sites for beginners regarding this topic would help a lot. :)
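
    One built-in route, sketched below (field positions and fonts are arbitrary): hand the values already loaded into the textboxes to a PrintDocument and draw them in its PrintPage handler. For a PDF you would instead print to a PDF printer driver or use a third-party PDF library.

      Dim WithEvents printDoc As New System.Drawing.Printing.PrintDocument()

      Private Sub printDoc_PrintPage(ByVal sender As Object, _
              ByVal e As System.Drawing.Printing.PrintPageEventArgs) Handles printDoc.PrintPage
          Dim f As New Font("Arial", 12)
          e.Graphics.DrawString("Name: " & TextBox6.Text, f, Brushes.Black, 50, 50)
          e.Graphics.DrawString("ID: " & TextBox1.Text, f, Brushes.Black, 50, 80)
          e.Graphics.DrawString("Department: " & TextBox7.Text, f, Brushes.Black, 50, 110)
          e.Graphics.DrawString("Course: " & TextBox8.Text, f, Brushes.Black, 50, 140)
      End Sub

      ' Call printDoc.Print() from a button click to send the page to the default printer.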

    Read the article

  • How to get group totals for self-referenced data in a data table?

    - by Nikhil Vaghela
    I have three columns in my data table: 1) ProductID, 2) ProductParentID, 3) ProductTotal. ProductID and ProductParentID are self-referencing columns where I can set a parent-child relationship and get child rows based on that relationship. Let us say I have the following data:
      Product1
          Product11
          Product12
          Product13
              Product131
              Product132
              Product133
      Product2
          Product21
          Product22
          Product23
    For the above hierarchy, what I want in the ProductTotal column is the total of each row's children, with those child totals rolled up into the parent product. E.g. if Product131's total is 10, Product13's total is 15 and Product133's total is 5, then Product13's total should be 30. The logic should work for any depth of hierarchy. Is there any functionality in the DataTable itself where I can achieve this without iterating through each row and doing it manually? Thanks.
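
    There is no built-in rollup, so some traversal is unavoidable, but a self-referencing DataRelation keeps it short and works for any depth. A sketch (it assumes the table lives in a DataSet and that ProductTotal is numeric):

      // Assumes: the DataTable is inside a DataSet, and ProductTotal is numeric.
      void RollUpProductTotals(DataSet dataSet)
      {
          DataTable table = dataSet.Tables["Products"];
          if (!dataSet.Relations.Contains("ProductHierarchy"))
              dataSet.Relations.Add(new DataRelation("ProductHierarchy",
                  table.Columns["ProductID"], table.Columns["ProductParentID"]));

          // Start from the roots (rows with no parent) and recurse downward.
          foreach (DataRow root in table.Select("ProductParentID IS NULL"))
              RollUp(root);
      }

      decimal RollUp(DataRow row)
      {
          decimal total = row.IsNull("ProductTotal")
              ? 0m
              : Convert.ToDecimal(row["ProductTotal"]);
          foreach (DataRow child in row.GetChildRows("ProductHierarchy"))
              total += RollUp(child);      // child totals roll up into the parent
          row["ProductTotal"] = total;
          return total;
      }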

    Read the article

  • How/is data shared between fastCGI processes?

    - by Josh the Goods
    I've written a simple Perl script that I'm running via FastCGI on Apache. The application loads a set of XML data files which are used to look up values based upon the query parameters of an incoming request. As I understand it, if I want to increase the number of concurrent requests my application can handle, I need to allow FastCGI to spawn multiple processes. Will each of these processes have to hold duplicate copies of the XML data in memory? Is there a way to set things up so that I can have one copy of the XML data loaded in memory while increasing the capacity to handle concurrent requests?
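
    By default, no single shared copy exists: each FastCGI worker is a separate process, so anything loaded before the accept loop is parsed once per process but duplicated across processes. A minimal illustrative skeleton is below (the XML layout and query parsing are made up). Sharing a single copy means moving the data out of the workers, e.g. into a small lookup daemon, memcached, or a DBM/mmap-backed file.

      #!/usr/bin/perl
      use strict;
      use warnings;
      use FCGI;
      use XML::Simple;

      # Loaded once per worker process, then reused for every request it serves.
      my $lookup = XMLin('data/lookup.xml');

      my $request = FCGI::Request();
      while ($request->Accept() >= 0) {
          my ($key) = ($ENV{QUERY_STRING} // '') =~ /key=([^&]+)/;
          my $value = (defined $key && $lookup->{entry}{$key}) || 'not found';
          print "Content-Type: text/plain\r\n\r\n$value\n";
      }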

    Read the article

  • Is there a declarative language for data definitions?

    - by Jekke
    Reading about WPF and thinking about my application's data store at the same time led me to wonder if there are any languages or tools that allow you to define relational data in a declarative way. A shallow Google search suggests no such thing exists, yet it seems so obviously useful. The kind of tool I have in mind would declaratively describe (at least) entities, relationships and views in a platform-agnostic way, and would act as an abstraction layer between data-driven applications and their datastores. Does any such tool exist?

    Read the article

  • PostgreSQL 8.3 data types: xml vs varchar

    - by Sejanus
    There's an xml data type in Postgres; I've never used it before, so I'd like to hear opinions. Downsides and upsides vs. using a regular varchar (or text) column to store XML. The text I'm going to store is XML, well-formed, UTF-8. There is no need to search by it (I've read searching by xml is slow). This XML is actually data prepared for PDF generation with Apache FOP. The XML can be generated dynamically from data found elsewhere (other Postgres tables); it's stored as is only so that I won't need to generate it twice. Kind of a backup #2 for already generated PDF documents. Anything else to know? Good practices, performance, maintenance, etc.?
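
    One concrete difference worth knowing: an xml column is checked for well-formedness on the way in, which is the main practical win over text/varchar when the document is never queried. A small illustration:

      -- The fo_xml column rejects anything that is not well-formed XML.
      CREATE TABLE fop_source (
          pdf_id  integer PRIMARY KEY,
          fo_xml  xml NOT NULL
      );

      INSERT INTO fop_source VALUES (1, '<doc><title>ok</title></doc>');   -- accepted
      INSERT INTO fop_source VALUES (2, '<doc><title>broken</doc>');       -- error: invalid XML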

    Read the article

  • Data Integration/EAI Project Lessons Learned

    - by Greg Harman
    Have you worked on a significant data or application integration project? I'm interested in hearing what worked for you and what didn't and how that affected the project both during and after implementation (i.e. during ongoing operation, maintenance and expansion). In addition to these lessons learned, please describe the project by including a quick overview of: The data sources and targets. Specifics are not necessary, but I'd like to know general technology categories e.g. RDBMS table, application accessed via a proprietary socket protocol, web service, reporting tool. The overall architecture of the project as related to data flows. Different human roles in the project (was this all done by one engineer? Did it include analysts with a particular expertise?) Any third-party products utilized, commercial or open source.

    Read the article

  • Cross Domain Post - Losing POST Data

    - by Tomas Beblar
    I have 2 servers, both running R2 / IIS7 / ASP Classic sites (can't get around any of that). Server A is making the following calls:
      Dim objXMLHTTP, xml
      Set xml = Server.CreateObject("Msxml2.ServerXmlHTTP.6.0")
      xml.Open "POST", templateName, false
      xml.setRequestHeader "Content-Type", "application/xml"
      xml.Send variables
    where templateName is the URL of Server B (it's an email template) and variables is a name-value pair string like a query string: password=myPassword&customerEmail=Dear+Bob,.... Server B receives the POST, but all the POST data (password=myPassword&customerEmail=Dear+Bob,....) is missing:
      password = Request.Form("templatePassword")
      customerEmail = Request.Form("RackAttackCustomerEmail")
    The above values are all empty. Here's the kicker: this all worked on our old servers (Windows Server 2003, IIS 6), but when we migrated over, it stopped working. My question is: what would cause the POST data to be dropped in IIS 7 when it all worked in IIS 6? I've done about 3 days of research into this, trying many different things, and nothing has worked. The POST data is just gone.
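
    One hedged guess worth trying (not a confirmed diagnosis): Request.Form only populates from bodies declared as application/x-www-form-urlencoded, and IIS 7 tends to be stricter about mismatched content types than IIS 6 was, so labeling the payload as what it actually is may bring the data back:

      Dim xml
      Set xml = Server.CreateObject("Msxml2.ServerXmlHTTP.6.0")
      xml.Open "POST", templateName, false
      ' The payload is a name=value pair string, so declare it as form data:
      xml.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
      xml.Send variables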

    Read the article

  • Can I write to a data output stream after reading a response from a data input stream?

    - by Sirius
    Hi, I want to do a client-server exchange like this:
    1. First the client sends/writes to the output stream.
    2. The server responds with some data that will be read from the input stream.
    3. After receiving the data, the client sends/writes to the output stream again to acknowledge that the data has been received.
    Now, do I have to close the output stream and re-open it again before doing step 3? Also, if someone could provide me with a snippet, it would be really helpful. Thanks.
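
    No, the streams only need to be closed when the whole conversation is over (closing either stream closes the socket). A minimal sketch of the client side; the host, port and messages are made up:

      import java.io.DataInputStream;
      import java.io.DataOutputStream;
      import java.net.Socket;

      public class Client {
          public static void main(String[] args) throws Exception {
              try (Socket socket = new Socket("localhost", 9000)) {
                  DataOutputStream out = new DataOutputStream(socket.getOutputStream());
                  DataInputStream in = new DataInputStream(socket.getInputStream());

                  out.writeUTF("REQUEST");               // step 1: client writes
                  out.flush();

                  String payload = in.readUTF();         // step 2: read the server's reply

                  out.writeUTF("RECEIVED " + payload);   // step 3: write again on the same stream
                  out.flush();
              }
          }
      }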

    Read the article

  • Creating a database desktop application with data manipulation in NetBeans using Java Persistence

    - by Lulu
    This is my first time using persistence in a Java program, because I usually connect via JDBC. I read that for large amounts of data it is best to use persistence. I tried playing with the CRUD example in NetBeans. It's not very helpful, though, because it only connects to the DB and allows addition and deletion of records. I need something that will allow me to manipulate the data, e.g. if the value in column C1 of table T1 is such-and-such, it will retrieve data from table T2. In short, I need to apply conditions before knowing what to retrieve exactly. The CRUD example already has a specific table to retrieve and only acts like a database manager. How is it possible to retrieve a specific item first and, from that, determine the next steps to be done? I'm also using embedded JavaDB/Derby as my database (also my first time using it, because I usually use remote MySQL).
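
    With JPA the pattern is simply: run one query, look at the result, then run the next query based on what you found. A sketch using made-up entities (Order, Shipment) and a persistence unit named myPU:

      EntityManagerFactory emf = Persistence.createEntityManagerFactory("myPU");
      EntityManager em = emf.createEntityManager();

      Order order = em.createQuery(
              "SELECT o FROM Order o WHERE o.id = :id", Order.class)
          .setParameter("id", orderId)
          .getSingleResult();

      // Apply the condition before deciding which table to read next.
      if ("PRIORITY".equals(order.getStatus())) {
          List<Shipment> shipments = em.createQuery(
                  "SELECT s FROM Shipment s WHERE s.orderId = :id", Shipment.class)
              .setParameter("id", orderId)
              .getResultList();
          // ... work with shipments
      }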

    Read the article

  • using the data-custom="" to bind to events

    - by Dean Peterson
    I'm pretty sure I'm gonna get slammed on this. I love using the data-whatever attribute to bind events to. It feels very clean to me and helps reserve my class attribute for just styling. I know this selector is among the slowest, so I don't use it when there are a lot of elements. Would love to hear compelling arguments against this.
      $("body").delegate("[data-action]", "click", function () {
          var action = $(this).attr("data-action");
          // route action to appropriate function
      });
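
    For the "route action to appropriate function" part, a lookup object keeps each new action to a one-line change (illustrative):

      var actions = {
          save:   function ($el) { /* ... */ },
          remove: function ($el) { /* ... */ }
      };

      $("body").delegate("[data-action]", "click", function () {
          var name = $(this).attr("data-action");
          if (actions[name]) {
              actions[name]($(this));
          }
      });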

    Read the article

  • Sort data in the C language

    - by ANIL MANE
    Hello C experts, I need a little help on the following requirement, as I know very little about C syntax. I have data in a file like this:
      73 54 57 [52]
      75 73 65 [23]
      65 54 57 [22]
      22 59 71 [12]
      22 28 54 [2]
      65 22 54 73 [12]
      65 28 54 73 [52]
      22 28 65 73 [42]
      65 54 57 73 [22]
      22 28 54 73 [4]
    where the value in brackets denotes the occurrence of that series. I need to sort this data descending by occurrence, with the series having the most elements on top, as follows:
      65 28 54 73 [52]
      22 28 65 73 [42]
      65 54 57 73 [22]
      65 22 54 73 [12]
      22 28 54 73 [4]
      28 59 71 [122]
      73 54 57 [52]
      22 28 65 [26]
      ...
    and so on. Can someone give me quick code for this? Thanks in advance.
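
    A hedged sketch of one way to do it (it assumes each line fits in 128 characters and that ties are broken only by the bracketed count): read every line, count the numbers before the bracket, then qsort by element count and occurrence, both descending.

      #include <stdio.h>
      #include <stdlib.h>

      #define MAX_RECORDS 1000

      typedef struct {
          int  nvalues;     /* how many numbers appear before the bracket */
          int  count;       /* the occurrence count inside the brackets   */
          char line[128];   /* original text, reprinted after sorting     */
      } Record;

      static int compare(const void *pa, const void *pb)
      {
          const Record *a = pa, *b = pb;
          if (a->nvalues != b->nvalues)
              return b->nvalues - a->nvalues;   /* more elements first */
          return b->count - a->count;           /* higher count first  */
      }

      int main(void)
      {
          Record recs[MAX_RECORDS];
          int n = 0;

          while (n < MAX_RECORDS && fgets(recs[n].line, sizeof recs[n].line, stdin)) {
              Record *r = &recs[n];
              r->nvalues = 0;
              r->count = 0;
              char *p = r->line;
              while (*p && *p != '[') {          /* count the series members */
                  char *end;
                  strtol(p, &end, 10);
                  if (end == p) { p++; continue; }
                  r->nvalues++;
                  p = end;
              }
              if (*p == '[')
                  sscanf(p, "[%d]", &r->count);  /* occurrence in brackets */
              n++;
          }

          qsort(recs, n, sizeof(Record), compare);
          for (int i = 0; i < n; i++)
              fputs(recs[i].line, stdout);
          return 0;
      }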

    Read the article
