Search Results

Search found 67143 results on 2686 pages for 'complex data types'.

  • Get type of the parameter from list of objects, templates, C++

    - by CrocodileDundee
    This question follows on from my previous question, Get type of the parameter, templates, C++. There is the following data structure:

    Object1.h

        template <class T>
        class Object1
        {
        private:
            T a1;
            T a2;
        public:
            T getA1() { return a1; }
            typedef T type;
        };

    Object2.h

        template <class T>
        class Object2 : public Object1<T>
        {
        private:
            T b1;
            T b2;
        public:
            T getB1() { return b1; }
        };

    List.h

        template <typename Item>
        struct TList
        {
            typedef std::vector<Item> Type;
        };

        template <typename Item>
        class List
        {
        private:
            typename TList<Item>::Type items;
        };

    Is there any way to get the type T of an object from the list of objects (i.e. Object is not a direct parameter of the function but a template parameter)?

        template <class Object>
        void process (List <Object> *objects)
        {
            typename Object::type a1 = objects[0].getA1();
            // g++ error: 'Object1<double>*' is not a class, struct, or union type
        }

    But this construction works (i.e. Object represents a parameter of the function):

        template <class Object>
        void process (Object *o1)
        {
            typename Object::type a1 = o1->getA1(); // OK
        }
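
    A minimal sketch of one way to make the first form compile, assuming the list really holds Object1/Object2 values rather than pointers, is to keep reading the nested typedef from the element type but index into the list's own container instead of into the List pointer. The value_type typedef, add(), operator[] and main() below are illustrative additions that are not in the posted code:

        #include <cstddef>
        #include <vector>

        template <class T>
        class Object1
        {
        public:
            Object1() : a1(T()), a2(T()) {}
            T getA1() { return a1; }
            typedef T type;
        private:
            T a1;
            T a2;
        };

        template <typename Item>
        class List
        {
        public:
            typedef Item value_type;                          // hypothetical: expose the element type
            void add(const Item &item) { items.push_back(item); }
            Item &operator[](std::size_t i) { return items[i]; }
        private:
            std::vector<Item> items;
        };

        template <class Object>
        void process(List<Object> *objects)
        {
            // Object is the element type of the list, so its nested typedef is visible again
            typename Object::type a1 = (*objects)[0].getA1();
            (void) a1;
        }

        int main()
        {
            List< Object1<double> > objects;
            objects.add(Object1<double>());
            process(&objects);
            return 0;
        }

    If the list is actually instantiated with pointer elements (as the g++ message mentioning 'Object1<double>*' suggests), the nested typedef would have to be reached through the pointee type instead, for example via a small trait that strips the pointer before asking for ::type.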

    Read the article

  • Integrating Oracle Hyperion Smart View Data Queries with MS Word and Power Point

    - by Andreea Vaduva
    Most Smart View users probably appreciate that they can use just one add-in to access data from the different sources they might work with, like Oracle Essbase, Oracle Hyperion Planning, Oracle Hyperion Financial Management and others. But not all of them are aware of the options to integrate data analyses not only in Excel, but also in MS Word or PowerPoint. While in the past, copying and pasting single numbers or tables from a recent analysis in Excel made the pasted content a static snapshot, copying so-called Data Points now creates dynamic, updateable references to the data source. It also provides additional nice features, which can make life easier and less stressful for Smart View users.

    So, how does this option work? After building an ad-hoc analysis with Smart View as usual in an Excel worksheet, any area including data cells/numbers from the database can be highlighted in order to copy data points - even single data cells only.

    TIP: It is not necessary to highlight and copy the row or column descriptions.

    Next, from the Smart View ribbon select Copy Data Point. Then switch to the Word or PowerPoint document into which the selected content should be copied. Note that in these Office programs you will find a menu item Smart View; from it select the Paste Data Point icon. The copied details from the Excel report will be pasted, but showing #NEED_REFRESH in the data cells instead of the original numbers. After clicking the Refresh icon on the Smart View menu the data will be retrieved and displayed. (Maybe at that moment a login window pops up and you need to provide your credentials.) It works in the same way if you just copy one single number without any row or column descriptions, for example in order to incorporate it into a continuous text.

    From now on, for any subsequent updates of the data shown in your documents you only need to refresh the data by clicking the Refresh button on the Smart View menu, without copying and pasting the content again. As you might realize when trying out this feature on your own, there won't be any Point of View shown in the Office document. Also, as seen in the example where only a single data cell was copied, there aren't any member names or row/column descriptions copied, which are usually required in an ad-hoc report in order to exactly define where data comes from or how data is queried from the source. Well, these definitions are not visible, but they are transferred to the Word or PowerPoint document as well. They are stored in the background for each individual data cell copied and can be made visible by double-clicking the data cell. So for each cell/number the complete connection information is stored along with the exact member/cell intersection from the database.

    And that's not all: you now have the chance to exchange the members originally selected in the Point of View (POV) in the Excel report. By selecting the Manage POV option from the Smart View menu in Word or PowerPoint, the POV Manager – Queries window opens. You can now change your selection for each dimension from the original POV, either by double-clicking the dimension member in the lower right box under POV or by selecting the Member Selector icon on the top right-hand side of the window.
    After confirming your changes you need to refresh your document again. Be aware that this will update all (!) numbers taken from one and the same original Excel sheet, even if they appear in different locations in your Office document, reflecting your recent changes in the POV.

    TIP: Build your original report in such a way that dimensions you might want to change from within Word or PowerPoint are placed in the POV.

    And there is another really nice feature I wouldn't like to miss mentioning: using Dynamic Data Points in the way described above, you will never miss or need to search again for the original Excel sheet from which values were taken and copied as data points into an Office document, because from even one single data cell Smart View is able to recreate the entire original report content with just a few clicks. Select one of the numbers from within your Word or PowerPoint document by double-clicking, then select the Visualize in Excel option from the Smart View menu. Excel will open and Smart View will rebuild the entire original report, including POV settings, and retrieve all data from the most recent actual state of the database. (It might be necessary to provide your credentials before data is displayed.) However, in order to make this work, an active online connection to your databases on the server is necessary, as well as at least read access to the retrieved data. But apart from this, your newly built Excel report is fully functional for ad-hoc analysis and can be used in the common way for drilling, pivoting and all the other known functions and features.

    So far about embedding Dynamic Data Points into Office documents and linking them back into Excel worksheets. You can apply this in the described way with ad-hoc analyses directly on Essbase databases or using Hyperion Planning and Hyperion Financial Management ad-hoc web forms. If you are also interested in other new features and smart enhancements in Essbase or Hyperion Planning, stay tuned for coming articles or check our training courses and web presentations. You can find general information about offerings for the Essbase and Planning curriculum or other Oracle-Hyperion products here (please make sure to select your country/region at the top of this page) or in the OU Learning paths section, where Planning, Essbase and other Hyperion products can be found under the Fusion Middleware heading (again, please select the right country/region). Or drop me a note directly: [email protected].

    About the Author: Bernhard Kinkel started working for Hyperion Solutions as a Presales Consultant and Consultant in 1998 and moved to Hyperion Education Services in 1999. He joined Oracle University in 2007, where he is a Principal Education Consultant. Based on these many years of working with Hyperion products he has detailed product knowledge across several versions. He delivers both classroom and live virtual courses. His areas of expertise are Oracle/Hyperion Essbase, Oracle Hyperion Planning and Hyperion Web Analysis.

    Read the article

  • How can Core Data store an NSData?

    - by mystify
    The documentation says that Core Data properties can only store NSString, NSNumber and NSDate types. However, a lot of Core Data users claim that Core Data can also store an NSData type. But I wasn't able to see that in the documentation, although the Xcode Data Modeler allows you to choose a data type called "binary" (which seems to be NSData). Did I miss something? Is there a hidden place in the documentation that indeed lists NSData for binary stuff?

    Read the article

  • Suggestion for a Data Structure!

    - by Jay
    I have the following requirements for a data structure:

    - Direct access to an element with the help of a key (the key will be an integer, and its range is the same as the integer range)
    - Avoid memory allocation in chunks (allocate contiguous memory for the data structure, including the data)
    - Should be able to grow the data structure size dynamically

    Which data structure would you suggest? Any pointers in the right direction will also be of help.
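
    One structure that satisfies all three points is an open-addressing hash table whose slots live in a single contiguous std::vector: integer keys get O(1) average lookup, the only allocation is one block (doubled when the table grows), and growth happens dynamically by rehashing. The sketch below is a minimal illustration of that idea (the class name IntMap and its put/get interface are made up for the example), not a tuned implementation:

        #include <cstddef>
        #include <cstdint>
        #include <vector>

        // Open-addressing hash map: all slots live in one contiguous std::vector,
        // lookups use linear probing, and the table doubles in size as it fills up.
        class IntMap
        {
        public:
            IntMap() : slots(16), count(0) {}

            void put(std::int32_t key, int value)
            {
                if ((count + 1) * 10 >= slots.size() * 7)      // grow before ~70% load
                    rehash(slots.size() * 2);
                Slot &s = findSlot(slots, key);
                if (!s.used) { s.used = true; s.key = key; ++count; }
                s.value = value;
            }

            int *get(std::int32_t key)                         // returns 0 if the key is absent
            {
                Slot &s = findSlot(slots, key);
                return s.used ? &s.value : 0;
            }

        private:
            struct Slot
            {
                bool used;
                std::int32_t key;
                int value;
                Slot() : used(false), key(0), value(0) {}
            };

            static Slot &findSlot(std::vector<Slot> &table, std::int32_t key)
            {
                std::size_t mask = table.size() - 1;           // table size is kept a power of two
                std::size_t i = static_cast<std::uint32_t>(key) & mask;
                while (table[i].used && table[i].key != key)
                    i = (i + 1) & mask;                        // linear probing
                return table[i];
            }

            void rehash(std::size_t newSize)
            {
                std::vector<Slot> bigger(newSize);
                for (std::size_t i = 0; i < slots.size(); ++i)
                    if (slots[i].used)
                        findSlot(bigger, slots[i].key) = slots[i];
                slots.swap(bigger);
            }

            std::vector<Slot> slots;
            std::size_t count;
        };

    If the keys happen to be dense (say 0..N), a plain std::vector indexed directly by key is simpler still and equally contiguous; the hash table only earns its keep when keys are sparse across the full integer range.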

    Read the article

  • Finding easily parseable chemical element data

    - by nickname
    I am writing an application that needs simple data found (mostly) on the periodic table of elements, such as atomic mass, atomic number, state, etc. However, I would prefer not to manually enter this data. I managed to find the NIST website (http://www.nist.gov/pml/data/edi.cfm) with all of the data I need, but not in a downloadable format. Where can I find this data? Preferably, it would be in an XML/YAML/JSON/other documented format, however, any format would be helpful.

    Read the article

  • C# - Fast and simple multi dimensional data structures?

    - by Jeremy Rudd
    I need to store multi-dimensional data consisting of numbers in a manner that's easy to work with. I'm capturing data in real time, and once processed I would destroy and GC older data. This data structure must be fast so it won't hit my overall app performance. The faster the better. What are my choices in terms of platform-supported data structures? I'm using VS 2010 and .NET 4.

    Read the article

  • New article available in "SOA Suite Essentials for WLI Users" series: Dynamic Data Lookup in a Business Process

    - by simone.geib
    It is my pleasure to announce the publishing of another article in our "SOA Suite Essentials for WLI Users" series: "Dynamic Data Lookup in a Business Process: Meta Data Cache Control in Oracle WebLogic Integration and Domain Value Maps in SOA Suite". This article explains how dynamic data can be retrieved in a business process using Domain Value Maps in SOA Suite and shows the similarities to the WLI XML MetaData Cache Control. Lots of customers have asked about this comparison and I hope they will find it useful. The article follows "Setting Web Service and JCA Adapter Endpoints Dynamically in Oracle SOA Suite" which describes how web services and JCA adapter endpoints in SOA Suite can be changed at run-time, and so completes the use case where a BPEL process writes to a file (via file adapter) and the output directory and the file name are set dynamically. Please let me know what you think about the series and this specific article.

    Read the article

  • WP7 “Phantom Data” Source Possibly Revealed?

    - by Bil Simser
    Recently there’s been rumours floating around regarding “phantom” Windows Phone 7 data being magically sent and received on the latest WP7 phones. The news has mostly been floating around Twitter so I didn’t pay it much attention. The BBC Technology News picked it up, so I thought I would look more into it myself, seeing that we have WP7 phones and maybe there was some truth to all this (and more importantly, what was the cause).

    Full disclosure: I don’t have a lot of data points around this. This is from looking at a few phone logs, changing the configuration and looking back again after the change. I haven’t done a clean baseline test nor have I done testing with hundreds of phones. I leave the experience up to the reader to decide.

    So I went spelunking into the phone logs to see what was up. Most providers will show you data usage, at least on a daily basis. I lucked out with the provider and plan in that they provide hourly breakdowns. Here’s a snapshot from my usage throughout one night.

        Timestamp      Data Usage
        12:38:30 AM    2098 Kilobytes
        1:30:30 AM        2 Kilobytes
        2:38:30 AM     7118 Kilobytes
        3:38:30 AM     6622 Kilobytes
        4:38:30 AM       76 Kilobytes
        5:38:30 AM       29 Kilobytes
        6:38:30 AM       19 Kilobytes
        7:38:30 AM       20 Kilobytes

    So a few observations from this data:

    - Data seems to be collected on a regular basis. Looking at some other people’s phone logs, the times vary but it’s always hourly.
    - There’s not a tremendous amount of data here (about 16 megabytes) but it seems like a lot for 7 hours.
    - The phone was connected to my home WiFi during this period.
    - Nothing was running and the phone was in a locked state.

    Like I said, not a lot of data but it adds up. 16MB for 7 hours = about 50MB in a 24 hour period. That’s just plain old data being collected (somewhere, somehow) and not actual usage (Marketplace, Email, Browsing, etc.). Besides, when connected to a WiFi network you shouldn’t be charged data usage from your phone company (in theory, YMMV).

    After reviewing the logs I formed a theory that the only thing that could possibly be sending data is the Feedback feature. With no other apps running under lock, what else could it be? In Windows Phone 7, under your Settings, the last option is Feedback. This sends feedback to Microsoft to “help improve Windows Phone”. On this page you have three options:

    - Send feedback and use my cellular data connection
    - Send feedback and (presumably) use my WiFi connection
    - Don’t send feedback

    Knowing what I know about Microsoft, they do use the feedback data. For example, some of the placement and inclusion of features in Office 2007 was based on the Feedback data that Office sends (assuming you had opted in). However, in the Privacy Statement (it’s long but a good read at least once in your life), the Phone manual, and every other source I could look at, there is no information about how much data it’s planning to send, just that it’s sending some data and that “some data charges with your carrier may apply”.

    Looking back at the logs, I have to wonder. 6MB at 3:30 and *then* 7MB the next hour. That’s a lot of information. And it adds up. 50MB in a 24 hour period x 30 days puts most people over a normal 1GB plan. And frankly, why am I paying for a data plan only to have 80% of it chewed up by Microsoft, with no real benefit to me. If they included porn in the 50MB daily transfer I’d be okay with this, but I don’t see any new movies on my phone.

    So I turned it off. Set Feedback to disabled and wait. I waited. And waited. And generally didn’t use the phone if I could.
    The next day I went back to look at the data usage logs from the time period after turning the feedback mechanism off. Here are the results.

        Timestamp      Data Usage
        1:19:48 PM        0 Kilobytes
        2:19:48 PM        0 Kilobytes
        3:19:48 PM        0 Kilobytes
        4:19:48 PM      678 Kilobytes (took a phone call)
        5:19:48 PM       82 Kilobytes
        6:19:48 PM       88 Kilobytes
        7:20:30 PM       86 Kilobytes (guess they changed their reporting time)
        8:20:30 PM       86 Kilobytes
        9:20:30 PM       66 Kilobytes
        10:20:30 PM      67 Kilobytes
        11:20:30 PM      49 Kilobytes
        12:20:30 AM      32 Kilobytes
        1:20:30 AM       38 Kilobytes
        2:20:31 AM       18 Kilobytes
        3:20:31 AM       27 Kilobytes
        4:20:31 AM       86 Kilobytes
        5:20:31 AM       53 Kilobytes
        6:20:31 AM       22 Kilobytes
        7:22:15 AM       30 Kilobytes (another reporting time change)
        8:22:15 AM       29 Kilobytes
        9:22:15 AM       74 Kilobytes
        10:22:15 AM     154 Kilobytes (phone call)
        11:22:15 AM      12 Kilobytes
        12:13:27 PM      49 Kilobytes
        1:13:27 PM      197 Kilobytes (phone call)

    Quite a *drastic* change from when Feedback was turned on. I mean, for a 24 hour period (sans 3 phone calls) I consumed about 1MB. Still quite a bit of transfer going on, but at least it amounts to 30MB per month, not 30MB per day!

    Like I said, this observation is neither scientific nor conclusive. You decide what to do, but frankly, until Microsoft makes this data transfer exempt from your data plan (like that will happen) I would just turn Feedback off. YMMV.

    Read the article

  • How can a .NET programmer learn Big Data/Hadoop? [on hold]

    - by Smith Pascal Jr.
    I have been an ASP.NET developer for some time now, and I have been reading a lot about Big Data/Hadoop and its future as to how it is the next technology in IT and how it would be useful in creating millions of jobs in the US and elsewhere in the world. Now, since Hadoop is an open-source big data tool managed by the Apache Software Foundation, I'm assuming I have to be well versed in Java - correct me if I'm wrong. Moreover, how can a .NET programmer learn Big Data and its related technologies and work professionally full-time with them? What challenges and opportunities does a .NET professional face while changing the technology platform? Please advise. Thanks

    Read the article

  • Post image and other data using multipart/form-data in iPhone

    - by abdulsamad
    Hi all I am sending some data and and an image to the server using multipart/form-data in objective C. kindly give me some Php code that how can i save the image on the server i am able to get the other variables on the server that i am passing with the image. kindly see my obj C code and php and tell me where i am wrong. your help will be highly appreciated. here i make the POST request. ////////////////////// NSString *stringBoundary, *contentType, *baseURLString, *urlString; NSData *imageData; NSURL *url; NSMutableURLRequest *urlRequest; NSMutableData *postBody; // Create POST request from message, imageData, username and password baseURLString = @"http://localhost:8888/Test.php"; urlString = [NSString stringWithFormat:@"%@", baseURLString]; url = [NSURL URLWithString:urlString]; urlRequest = [[[NSMutableURLRequest alloc] initWithURL:url] autorelease]; [urlRequest setHTTPMethod:@"POST"]; // Set the params NSString *path = [[NSBundle mainBundle] pathForResource:@"LibraryIcon" ofType:@"png"]; imageData = [[NSData alloc] initWithContentsOfFile:path]; // Setup POST body stringBoundary = [NSString stringWithString:@"0xKhTmLbOuNdArY"]; contentType = [NSString stringWithFormat:@"multipart/form-data; boundary=%@", stringBoundary]; [urlRequest addValue:contentType forHTTPHeaderField:@"Content-Type"]; // Setting up the POST request's multipart/form-data body postBody = [NSMutableData data]; [postBody appendData:[[NSString stringWithFormat:@"\r\n\r\n--%@\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:@"Content-Disposition: form-data; name=\"source\"\r\n\r\n"] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:@"lighttable"] dataUsingEncoding:NSUTF8StringEncoding]]; // So Light Table show up as source in Twitter post [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:@"Content-Disposition: form-data; name=\"title\"\r\n\r\n"] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:book.title] dataUsingEncoding:NSUTF8StringEncoding]]; // title [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:@"Content-Disposition: form-data; name=\"isbn\"\r\n\r\n"] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:book.isbn] dataUsingEncoding:NSUTF8StringEncoding]]; // isbn [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:@"Content-Disposition: form-data; name=\"price\"\r\n\r\n"] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:txtPrice.text] dataUsingEncoding:NSUTF8StringEncoding]]; // Price [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:@"Content-Disposition: form-data; name=\"condition\"\r\n\r\n"] dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithString:txtCondition.text] dataUsingEncoding:NSUTF8StringEncoding]]; // Price NSString *imageFileName = [NSString stringWithFormat:@"photo.jpeg"]; [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@\r\n", stringBoundary] 
dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[[NSString stringWithFormat:@"Content-Disposition: form-data; name=\"upload\"; filename=\"%@\"\r\n",imageFileName] dataUsingEncoding:NSUTF8StringEncoding]]; //[postBody appendData:[[NSString stringWithFormat:@"Content-Disposition: form-data; name=\"upload\"\r\n\n\n"]dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:[@"Content-Type: image/jpeg\r\n\r\n" dataUsingEncoding:NSUTF8StringEncoding]]; [postBody appendData:imageData]; [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; // [postBody appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", stringBoundary] dataUsingEncoding:NSUTF8StringEncoding]]; NSLog(@"postBody=%@", [[NSString alloc] initWithData:postBody encoding:NSASCIIStringEncoding]); [urlRequest setHTTPBody:postBody]; NSLog(@"Image data=%@",[[NSString alloc] initWithData:imageData encoding:NSASCIIStringEncoding]); // Spawn a new thread so the UI isn't blocked while we're uploading the image [NSThread detachNewThreadSelector:@selector(uploadingDataWithURLRequest:) toTarget:self withObject:urlRequest]; I the method uploadingDataWithURLRequest i post the request to the server... Here is my php Code ?php $title = $_POST['title']; $isbn = $_POST['isbn']; $price = $_POST['price']; $condition = $_POST['condition']; $image=$_FILES['image']['name']; if($image) { $filename = 'newimage.jpeg'; file_put_contents($filename, $image); echo "image is there"; } else { echo "image is nil"; } ?> I am unable to get the image on server kindly help me where i am wrong.

    Read the article

  • Excel 2010 data validation warning (compatibility mode)

    - by Madmanguruman
    We have some legacy worksheets that were created in Excel 2003, which are used by LabVIEW-based test automation software. The current LabVIEW software can only handle the legacy .xls format, so we're forced to keep these worksheets as-is for the time being. We've migrated to Office 2010 and when working with these worksheets, I see this warning: "The following features in this workbook are not supported by earlier versions of Excel. These features may be lost or degraded when you save this workbook in the currently selected file format. Click Continue to save the workbook anyway. To keep all of your features, click Cancel and then save the file in one of the new file formats." "Significant loss of functionality" "One or more cells in this workbook contain data validation rules which refer to values on other worksheets. These data validation rules will not be saved." When I click 'Find', some cells that do indeed have validation rules are highlighted, but those rules are all on the same worksheet! We're using simple list-based validation, with some cells off to the side containing the valid values (for example, cell B4 has a List with Source "=$D$4:$E$4") This makes no sense to me whatsoever. One, the workbook was created in Excel 2003, so obviously we couldn't implement a feature that doesn't exist. Secondly, the modifications we're making don't involve changing the validation rules at all. Thirdly, the complaint that Excel is making is incorrect! All of the rules are on the same worksheet as the target. As if the story wasn't bizarre enough: I went ahead and saved the worksheet with Excel 2010. I then went to an old computer back in the lab and opened the document with Excel 2003. Guess what - the validations were untouched! My questions are: is this a legitimate bug in Excel 2010, or is this some exotic error in the legacy .xls worksheet that is confusing the heck out of Excel 2010? Has anyone else observed this issue working in compatibility mode?

    Read the article

  • Multidimensional data table?

    - by ShreevatsaR
    [Apologies if this sort of question is off-topic for SuperUser. Please redirect to the right place if so.] There is a 3-dimensional array of values. (That is, instead of a table/2-dimensional array with values in a grid, the values can be thought of in a cube instead.) Is there a way to display this "cube" interactively, ideally on a webpage? Specifically, given the data, it would work something like this: the user selects two of the 3 variables. He then sees a "stack" of tables, one for each value of the third variable (cross-sections, in other words). By selecting the appropriate table from the stack, he can see the (i,j,k) value he wants. The "technology" for displaying such a thing (stacked tables, rotation, etc.) already exists, so this seems the sort of thing that someone ought to have written already. To be clear: I don't need sophisticated graphics necessarily, just the ability to select from cross-sections of variables. But I have no experience with (say, for displaying on a webpage) what web gadgets exist, so I'm clueless how to even search for one. (Google searches like "multidimensional data visualization" didn't throw up anything useful. Google Spreadsheets can do a few kinds of charts which can be embedded on a webpage, but I cannot tell if this is one of them.) [I can imagine how it ought to work for higher dimensions. For four-dimensions, instead of selecting just a stack, you'd first select an (i,j) from an "outer table", which would show all (k,l) values for that (i,j). For higher dimensions, inductively: you select (i,j), and then repeat what you'd do with 2 fewer dimensions.] So has this been written? Is this easy to write? Where ought one to look for such a thing?

    Read the article

  • Creating a test database with copied data *and* its own data

    - by Jordan Reiter
    I'd like to create a test database that each day is refreshed with data from the production database. BUT, I'd like to be able to create records in the test database and retain them rather than having them be overwritten. I'm wondering if there is a simple straightforward way to do this. Both databases run on the same server, so apparently that rules out replication? For clarification, here is what I would like to happen:

    - Test database is created with production data.
    - I create some test records that I want to keep running on the test server (basically so I can have example records that I can play with).
    - Next day, the database is completely refreshed, but the records I created that day are retained. Records that were untouched that day are replaced with records from the production database.

    The complication is if a record in the production database is deleted, I want it to be deleted on the test database too, so I do want to get rid of records in the test database that no longer exist in the production database, unless those records were created within the test database. Seems like the only way to do this would be to have some sort of table storing metadata about the records being created? So for example, something like this:

        CREATE TABLE MetaDataRecords (
            id integer not null primary key auto_increment,
            tablename varchar(100),
            action char(1),
            pk varchar(100)
        );

        DELETE FROM testdb.users
        WHERE NOT EXISTS (SELECT * FROM proddb.users
                          WHERE proddb.users.id = testdb.users.id)
          AND NOT EXISTS (SELECT * FROM testdb.MetaDataRecords
                          WHERE testdb.MetaDataRecords.pk = testdb.users.pk
                            AND testdb.MetaDataRecords.action = 'C'
                            AND testdb.MetaDataRecords.tablename = 'users');

    Read the article

  • Import data in Excel that doesn't have a row delimiter, but number of columns is known

    - by Alex B
    So I have this text file that looks something like this: Header1 Header2 Header3 Header4 A1 B1 C1 D1 A2 B2 C2 D2 and so on. When imported, I'd want the data to format itself into 4 columns. I tried Get External Data from Text, and it successfully imports it, but it doesn't wrap it around, so it just keeps making columns for every space. I'd want it to go on to the next line after 4 (in this case) elements have been added. What's the simplest way to achieve this? EDIT: My answer follows, since I'm not yet allowed to answer my own questions. The Excel function I needed is called INDIRECT(). Not sure how it actually works though, so hopefully someone can help out with that, but the function call that worked for me is =INDIRECT(ADDRESS((ROW(A1)-1)*4+COLUMN(A1),1)) which I found over here: http://www.ozgrid.com/forum/showthread.php?t=101584&p=456031#post456031 Note: this required me to add the text to Excel where I'd get this row full of columns, and then flip it so that I'd have a column full of rows.

    Read the article

  • saving data from a failing drive

    - by intuited
    An external 3½" HDD seems to be in danger of failing — it's making ticking sounds when idle. I've acquired a replacement drive, and want to know the best strategy to get the data off of the dubious drive with the best chance of saving as much as possible. There are some directories that are more important than others. However, I'm guessing that picking and choosing directories is going to reduce my chances of saving the whole thing. I would also have to mount it, dump a file listing, and then unmount it in order to be able to effectively prioritize directories. Adding in the fact that it's time-consuming to do this, I'm leaning away from this approach. I've considered just using dd, but I'm not sure how it would handle read errors or other problems that might prevent only certain parts of the data from being rescued, or which could be overcome with some retries, but not so many that they endanger other parts of the drive from being saved. I guess ideally it would do a single pass to get as much as possible and then go back to retry anything that was missed due to errors. Is it possible that copying more slowly — e.g. pausing every x MB/GB — would be better than just running the operation full tilt, for example to avoid any overheating issues? For the "where is your backup" crowd: this actually is my backup drive, but it also contains some non-critical and bulky stuff, like music, that aren't backups, i.e. aren't backed up. The drive has not exhibited any clear signs of failure other than this somewhat ominous sound. I did have to fsck a few errors recently — orphaned inodes, incorrect free blocks/inodes counts, inode bitmap differences, zero dtime on deleted inodes; about 20 errors in all. The filesystem of the partition is ext3.

    Read the article

  • Recovering data from an external hard drive

    - by CCallaghan
    I have a WD Elements 2GB hard drive (formatted NTFS). I accidentally kicked out the USB cable while writing data to the disk, and now I can't access most of the data. Although this was ostensibly my backup drive, there is a great deal of important material on there which was only on there. I realise how idiotic this makes me. (So, formatting is not an option.) Things I've tried/information I've gathered: Windows Explorer will recognise the drive itself. However, it will not access most directories therein (and will sometimes crash when exploring). I can access all of the directories through the command line, but the dir command will often report that it can't read any files in most of the directories. The situation was similar when I hooked it up to an Ubuntu machine: the file explorer crashed, but I could access directories - but not files in those directories - via terminal commands. Several files I tried to copy out either resulted in an I/O error being reported or resulted in the command line crashing. The Disk Management utility on Windows reports a healthy disk formatted as NTFS and not RAW. It also indicates the correct amount of space used up and its capacity (so it seems that the files are not deleted). I've tried to run chkdsk, but that hangs on Step 2 (checking indexes) at 74%. Step 1 reported no bad sectors. I tried Recuva, but that didn't seem to work (stalled at 0% for half an hour). I should also note that the disk doesn't seem to be spinning smoothly; it seems to be chopping back, like it's reading the same sector over and over again. I noticed this after I kicked out the cable. Any help would be greatly appreciated. Update: It would seem the problem has taken a turn for the worse. The external hard drive now shows up on my computer as a local disk and is not mountable by Linux.

    Read the article

  • Data recovery on working hard drive

    - by emgee
    So I have a 5-bay hot-swap SATA enclosure that's connected to a Silicon Image-based SATA adapter in a computer. It's running XP Pro. There are two 1.5TB hard drives in slots 1 and 2 respectively, set up as RAID 1 using the Silicon Image utility. There are also two 1TB drives in bays 3 and 4, also set to RAID 1 the same way. The partitions for both RAID arrays are Dynamic partitions. A few days back, there was a bare hard drive that needed some files copied off of it, so it was popped into bay 5, that bay was set to pass-through, and the data was copied off of it. Later, I noticed that my 1.5TB drives no longer showed up in Windows. In the Silicon Image utility, the drives showed up fine, no error. However, Device Manager shows the RAID 1 array as uninitialized. It shows up as the right size, etc., but nothing else. There's no sign of anything wrong with either drive, so I'm not sure what happened exactly. I'm not the only one who has access to that computer, so it is possible something else was done to it that I don't know of. There's quite a lot of data on it still, and if at all possible, I'd prefer not to send it to Ontrack. Does anyone know of software that would restore the partitions, keeping in mind that it's a Windows LDM partition? I have access to a variety of operating systems, so something that would work on Mac, Windows or Linux would be acceptable. The programs I usually use are not compatible with LDM.

    Read the article

  • Easily Plotting Multiple Data Series in Excel

    - by John
    I really need help figuring out how to speed up graphing multiple series on a graph. I have separate devices that give monthly readings for several variables like pressure, temperature, and salinity. Each of these variables is going to be its own graph, with devices being the series. My x-axis is going to be the dates that these values were taken. The problem is that it takes ages to do this for each spreadsheet, since I have monthly dates from 1950 up to the present and I have about 50 devices in each spreadsheet. I also have graphs for calculated values that are in columns next to them. Each of these devices is going to become a data series in the graph. For example, in one of my graphs I have all the pressures from the devices, and each of the data series' names is the name of the device. I want a fast way to do this. Doing this manually is taking a very long time. Please help! Is there any easier way to do this? The data is consistent and the dates all line up; I am just repeating the same clicks over and over again. Thank you!

    Read the article

  • Core Data Migration - "Can't add source store" error

    - by Tofrizer
    Hi, In my iPhone app I'm using Core Data and I've made changes to my data model that cannot be automatically migrated over (i.e. added new relationships). I added the data model version (Design - Data Model - Add Model Version) and applied my new data model changes to the new version 2. I then created a mapping object model and set the Source and Destination models to their correct data models (old and new respectively). When I run the app and call the persistentStoreCoordinator, my app barfs with the following: 2010-02-27 02:40:30.922 XXXX[73578:20b] Unresolved error Error Domain=NSCocoaErrorDomain Code=134110 UserInfo=0xfc2240 "Operation could not be completed. (Cocoa error 134110.)", { NSUnderlyingError = Error Domain=NSCocoaErrorDomain Code=134130 UserInfo=0xfbb3a0 "Operation could not be completed. (Cocoa error 134130.)"; reason = "Can't add source store"; } FWIW (not much i think) I've also made the usual code changes in persistentStoreCoordinator to use the NSMigratePersistentStoresAutomaticallyOption and NSInferMappingModelAutomaticallyOption (for future data model changes that can be automatically migrated). More relevantly, my managedObjectModel is created by calling initWithContentsOfURL where the file/resource type is "momd". I've tried updating both the source and destination model in the mapping model (Design - Mapping Model - Update XXX Model) as well as deleted the mapping model and recreated it. I've cleaned and re-built but all to no avail. I still get the above error message. Any pointers/thoughts on how I can further debug or resolve this problem please? I haven't posted any code snippets because this feels much more like a build environment issue (and my code is very standard - just the usual core data code to handle migrations using a mapping model but I'm happy to show the code if it helps). Appreciate any help. Thanks

    Read the article

  • iPhone and Core Data: how to retain user-entered data between updates?

    - by Shaggy Frog
    Consider an iPhone application that is a catalogue of animals. The application should allow the user to add custom information for each animal -- let's say a rating (on a scale of 1 to 5), as well as some notes they can enter in about the animal. However, the user won't be able to modify the animal data itself. Assume that when the application gets updated, it should be easy for the (static) catalogue part to change, but we'd like the (dynamic) custom user information part to be retained between updates, so the user doesn't lose any of their custom information. We'd probably want to use Core Data to build this app. Let's also say that we have a previous process already in place to read in animal data to pre-populate the backing (SQLite) store that Core Data uses. We can embed this database file into the application bundle itself, since it doesn't get modified. When a user downloads an update to the application, the new version will include the latest (static) animal catalogue database, so we don't ever have to worry about it being out of date. But, now the tricky part: how do we store the (dynamic) user custom data in a sound manner? My first thought is that the (dynamic) database should be stored in the Documents directory for the app, so application updates don't clobber the existing data. Am I correct? My second thought is that since the (dynamic) user custom data database is not in the same store as the (static) animal catalogue, we can't naively make a relationship between the Rating and the Notes entities (in one database) and the Animal entity (in the other database). In this case, I would imagine one solution would be to have an "animalName" string property in the Rating/Notes entity, and match it up at runtime. Is this the best way to do it, or is there a way to "sync" two different databases in Core Data?

    Read the article

  • Why is Scala very complex?

    - by Anantha Kumaran
    I am a student. I learned Java during my second year, and now I am in my fourth year. I got bored with Java and started to learn Scala. As I learn it, I find it very complex (although I love it). My question may apply to any new complex language: why is Scala complex? Is it because we need to create complex software? Or am I the only one who thinks it is complex?

    Read the article

  • What are the responsibilities of the data layer?

    - by alimac83
    I'm working on a project where I had to add a data layer to my application. I've always thought that the data layer is purely responsible for CRUD functions, i.e. it shouldn't really contain any logic but should simply retrieve data for the business layer to manipulate. However, I'm a little confused with my project because I'm not sure whether I've structured my app correctly for this scenario. Basically, I'm trying to retrieve a list of products from the database that fall within a certain pricing threshold. At the moment I have a function in my data layer that basically returns all products where price > min threshold and price < max threshold. But it got me thinking that maybe this is incorrect. Should the data layer simply return a list of ALL products and then the business logic do the filtering? I'm pretty confused over whether the data layer should simply provide methods that allow the business layer to get raw data, or whether it should be responsible for getting filtered data too. If anyone has an article or something explaining this in detail it'd be very helpful. Thanks

    Read the article

  • How to store and remove dynamic and automatic variables of a generic data type in a custom list data structure

    - by Vineel Kumar Reddy
    Hi, I have created a list data structure implementation for a generic data type, with each node declared as follows:

        struct Node {
            void *data;
            ....
            ....
        };

    So each node in my list has a pointer to the actual data item (generic - it could be anything) that should be stored in the list. I have the following signature for adding a node to the list:

        AddNode(struct List *list, void *eledata);

    The problem is that when I want to remove a node, I also want to free the data block pointed to by the data pointer inside the node structure that is being freed. At first, freeing the data block seems to be straightforward:

        free(data); // forget about the syntax.....

    If data is pointing to a block created by malloc, then the above call is fine and we can free that block using the free function:

        int *x = (int*) malloc(sizeof(int));
        *x = 10;
        AddNode(list, (void*)x); // x can be freed as it was created using malloc

    But what if a node is created as follows?

        int x = 10;
        AddNode(list, (void*)&x); // x cannot be freed as it was not created using malloc

    Here we cannot call free on variable x! How do I know, or how do I implement, the functionality for both dynamically allocated variables and automatic ones that are passed to my list?

    Thanks in advance...
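
    One common approach, sketched below in plain C, is to store an optional "deleter" function pointer alongside each node when it is added: heap data gets free (or a custom cleanup function), while automatic or static variables get NULL so the list never tries to release them. The free_data field, the extra AddNode parameter and the RemoveHead helper are illustrative additions, not part of the original code:

        #include <stdlib.h>

        struct Node {
            void *data;
            void (*free_data)(void *);   /* NULL means the list does not own the data */
            struct Node *next;
        };

        struct List {
            struct Node *head;
        };

        /* Pass free (or a custom deleter) for heap data, NULL for automatic/static data. */
        void AddNode(struct List *list, void *eledata, void (*free_data)(void *))
        {
            struct Node *node = (struct Node *) malloc(sizeof *node);
            node->data = eledata;
            node->free_data = free_data;
            node->next = list->head;
            list->head = node;
        }

        /* Remove the head node, releasing the payload only if the list owns it. */
        void RemoveHead(struct List *list)
        {
            struct Node *node = list->head;
            if (node == NULL)
                return;
            list->head = node->next;
            if (node->free_data != NULL)
                node->free_data(node->data);
            free(node);
        }

        int main(void)
        {
            struct List list = { NULL };

            int *heap_x = (int *) malloc(sizeof *heap_x);
            *heap_x = 10;
            AddNode(&list, heap_x, free);    /* heap data: the list frees it */

            int stack_x = 20;
            AddNode(&list, &stack_x, NULL);  /* automatic variable: never freed by the list */

            RemoveHead(&list);
            RemoveHead(&list);
            return 0;
        }

    An equivalent alternative is to keep a single deleter per list instead of per node, or to have the list copy the data into memory it always owns; which fits best depends on whether different nodes need different ownership rules.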

    Read the article

  • Manually wiring up unobtrusive jquery validation client-side without Model/Data Annotations, MVC3

    - by cmorganmcp
    After searching and experimenting for 2 days I relent. What I'd like to do is manually wire up injected html with jquery validation. I'm getting a simple string array back from the server and creating a select with the strings as options. The static fields on the form are validating fine. I've been trying the following: var dates = $("<select id='ShiftDate' data-val='true' data-val-required='Please select a date'>"); dates.append("<option value=''>-Select a Date-</option>"); for (var i = 0; i < data.length; i++) { dates.append("<option value='" + data[i] + "'>" + data[i] + "</option>"); } $("fieldset", addShift).append($("<p>").append("<label for='ShiftDate'>Shift Date</label>\r").append(dates).append("<span class='field-validation-valid' data-valmsg-for='ShiftDate' data-valmsg-replace='true'></span>")); // I tried the code below as well instead of adding the data-val attributes and span manually with no luck dates.rules("add", { required: true, messages: { required: "Please select a date" } }); // Thought this would do it when I came across several posts but it didn't $.validator.unobtrusive.parse(dates.closest("form")); I know I could create a view model ,decorate it with a required attribute, create a SelectList server-side and send that, but it's more of a "how would I do this" situation now. Can anyone shed light on why the above code wouldn't work as I expect? -chad

    Read the article
