Search Results

Search found 83878 results on 3356 pages for 'google data api'.

  • Essence of BiMap in Google collections

    - by littleEinstein
    I am still quite puzzled by the BiMap in Google collections/Guava. It is claimed that "the two bimaps are backed by the same data; any changes to one will appear in the other." I browsed through the source code and found the use of delegate in ForwardingMap, but in any actual subclass of StandardBiMap I do see the data being put into both the forward and the reverse map. So what is the essence, and why does it claim to save space by keeping only one copy of the data? Is it just that the actual objects are one set, while two distinct sets of references to these objects are still needed, one maintained in the forward map and the other in the reverse map?

      private V putInBothMaps(K key, V value, boolean force) {
        boolean containedKey = containsKey(key);
        if (containedKey && Objects.equal(value, get(key))) {
          return value;
        }
        if (force) {
          inverse().remove(value);
        } else if (containsValue(value)) {
          throw new IllegalArgumentException(
              "value already present: " + value);
        }
        V oldValue = super.put(key, value);
        updateInverseMap(key, containedKey, oldValue, value);
        return oldValue;
      }
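
    The sharing the documentation claims is observable from the public API. Below is a minimal sketch, assuming Guava's HashBiMap (the class names are Guava's; the rest is illustrative): an entry added through the inverse view is immediately visible through the forward view, because both views are references into the same underlying entries.

      import com.google.common.collect.BiMap;
      import com.google.common.collect.HashBiMap;

      public class BiMapSharing {
          public static void main(String[] args) {
              BiMap<String, Integer> forward = HashBiMap.create();
              forward.put("one", 1);

              // The inverse view is backed by the same data, not a copy.
              BiMap<Integer, String> inverse = forward.inverse();
              inverse.put(2, "two");

              // The entry added via the inverse view appears in the forward map.
              System.out.println(forward.get("two")); // prints 2
          }
      }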

  • Aging Data Structure in C#

    - by thelsdj
    I want a data structure that will allow querying how many items were added in the last X minutes. An item may be just a simple identifier or a more complex data structure; preferably the item's timestamp will be stored in the item itself rather than outside it (as a hash or similar; I wouldn't want problems with multiple items having the same timestamp). So far it seems that with LINQ I could easily filter items with a timestamp greater than a given time and aggregate a count, though I'm hesitant to work .NET 3.5-specific features into my production environment yet. Are there any other suggestions for a similar data structure? The other part I'm interested in is aging old data out: if I'm only going to ask for counts of items less than 6 hours old, I would like anything older than that removed from the data structure, because this may be a long-running program.
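
    A rough sketch of one way to do this without LINQ, shown here in Java purely for illustration (a C# Queue<T> version is a direct translation; all names below are made up): keep items in arrival order, evict expired ones from the head, and count over what remains.

      import java.util.ArrayDeque;
      import java.util.Deque;

      // Hypothetical wrapper carrying the item's own timestamp, in milliseconds.
      class Timestamped<T> {
          final long timeMillis;
          final T item;
          Timestamped(long timeMillis, T item) { this.timeMillis = timeMillis; this.item = item; }
      }

      class AgingQueue<T> {
          private final Deque<Timestamped<T>> items = new ArrayDeque<>();
          private final long maxAgeMillis; // e.g. 6 hours for the use case above

          AgingQueue(long maxAgeMillis) { this.maxAgeMillis = maxAgeMillis; }

          // Items arrive in time order, so appending keeps the deque sorted.
          void add(T item) {
              items.addLast(new Timestamped<>(System.currentTimeMillis(), item));
          }

          // Count items newer than the given window, evicting expired ones first.
          int countWithin(long windowMillis) {
              evict();
              long cutoff = System.currentTimeMillis() - windowMillis;
              int count = 0;
              for (Timestamped<T> t : items) {
                  if (t.timeMillis >= cutoff) count++;
              }
              return count;
          }

          // Drop anything older than the maximum age from the head of the deque.
          private void evict() {
              long cutoff = System.currentTimeMillis() - maxAgeMillis;
              while (!items.isEmpty() && items.peekFirst().timeMillis < cutoff) {
                  items.removeFirst();
              }
          }
      }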

  • Is it safe to convert a mysqlpp::sql_blob to a std::string?

    - by Runcible
    I'm grabbing some binary data out of my MySQL database. It comes out as a mysqlpp::sql_blob type. It just so happens that this BLOB is a serialized Google Protobuf, which I need to de-serialize so that I can access it normally. This gives a compile error, since ParseFromString() is not intended for mysqlpp::sql_blob types:

      protobuf.ParseFromString( record.data );

    However, if I force the cast, it compiles OK:

      protobuf.ParseFromString( (std::string) record.data );

    Is this safe? I'm particularly worried because of this snippet from the mysqlpp documentation: "Because C++ strings handle binary data just fine, you might think you can use std::string instead of sql_blob, but the current design of String converts to std::string via a C string. As a result, the BLOB data is truncated at the first embedded null character during population of the SSQLS. There's no way to fix that without completely redesigning either String or the SSQLS mechanism." Thanks for your assistance!

  • Data mining logs to locate a bug

    - by gooli
    I'm working on a data distribution application which receives data from a source and distributes it to multiple target applications. After successfully distributing several messages per second for 8 days, it missed a single message and did not deliver it properly to the clients. As I was looking at the logs, I tried to find something there that was special about the time the miss happened, either in the data, its rate, or some other condition, but I couldn't find anything. Is there any data mining technique I can use to identify how that specific event differs from other events?
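
    One simple first pass, offered only as an illustration rather than a known fix: bucket the log into per-minute message counts and flag minutes whose rate deviates strongly from the mean. A sketch in Java (the 3-sigma threshold is an assumption):

      import java.util.List;

      class RateAnomaly {
          // Flags minutes whose message count is more than three standard
          // deviations from the mean of the whole series.
          static void flagOutliers(List<Integer> perMinuteCounts) {
              double mean = perMinuteCounts.stream()
                      .mapToInt(Integer::intValue).average().orElse(0);
              double variance = perMinuteCounts.stream()
                      .mapToDouble(c -> (c - mean) * (c - mean)).average().orElse(0);
              double stddev = Math.sqrt(variance);
              for (int minute = 0; minute < perMinuteCounts.size(); minute++) {
                  int count = perMinuteCounts.get(minute);
                  if (stddev > 0 && Math.abs(count - mean) > 3 * stddev) {
                      System.out.println("minute " + minute + " looks anomalous: " + count);
                  }
              }
          }
      }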

  • How to build a data flow?

    - by salvationishere
    I am running Visual Studio 2008 and following the SSIS tutorial described at http://msdn.microsoft.com/en-us/library/ms167106.aspx. I finished all of the tasks but am getting the following errors:

      Error 1: Validation error. Extract Sample Currency Data: Extract Sample Currency Data: input column "CurrencyAlternateKey" (123) has lineage ID 55 that was not previously used in the Data Flow task. Lesson 1.dtsx 0 0
      Error 2: Validation error. Extract Sample Currency Data SSIS.Pipeline: input column "CurrencyAlternateKey" (123) has lineage ID 55 that was not previously used in the Data Flow task. Lesson 1.dtsx 0 0

    Can you tell me what I need to do to make this build without errors?

  • Using Google Weather API with Lat and Lon - how to format?

    - by Paul
    I want to use the Google Weather API by passing lat and lon values. However, it seems Google needs these formatted differently from the values I have stored, i.e. for the town of McTavish I have values of 45.5 and -73.583. This works here: http://api.wunderground.com/auto/wui/geo/WXCurrentObXML/index.xml?query=45.5,-73.583 But when I use Google it does not; see: www.google.com/ig/api?weather=,,,45.5,-73.583 Any help appreciated. I would prefer to use the Google data.

  • Parsing the Content-Disposition header's filename in multipart/form-data

    - by Artyom
    Hello. According to the RFC, in multipart/form-data the Content-Disposition header's filename field takes an HTTP quoted string as its parameter: a string between quotes in which the character '\' can escape any other ASCII character. The problem is that web browsers don't do this. IE6 sends:

      Content-Disposition: form-data; name="file"; filename="z:\tmp\test.txt"

    instead of the expected

      Content-Disposition: form-data; name="file"; filename="z:\\tmp\\test.txt"

    which by the rules should be parsed as z:tmptest.txt instead of z:\tmp\test.txt. Firefox, Konqueror and Chrome don't escape '"' characters, for example:

      Content-Disposition: form-data; name="file"; filename=""test".txt"

    instead of the expected

      Content-Disposition: form-data; name="file"; filename="\"test\".txt"

    So... how would you suggest dealing with this issue?
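
    One pragmatic answer is to parse leniently instead of by the RFC: take everything between the first quote after filename= and the last quote in the header, and never treat backslash as an escape, since the browsers above don't send one. A sketch of that idea in Java (the helper name is made up):

      // Tolerant extraction of the filename parameter: assumes the raw value
      // runs from the first quote after filename= to the last quote in the
      // header, with no real backslash escaping.
      static String extractFilename(String contentDisposition) {
          int start = contentDisposition.indexOf("filename=\"");
          if (start < 0) return null;
          start += "filename=\"".length();
          // Using the last quote keeps embedded quotes intact, so
          // filename=""test".txt" yields "test".txt rather than an empty string.
          int end = contentDisposition.lastIndexOf('"');
          if (end <= start) return null;
          String raw = contentDisposition.substring(start, end);
          // Strip any Windows path prefix IE6 includes (z:\tmp\test.txt -> test.txt).
          int slash = Math.max(raw.lastIndexOf('\\'), raw.lastIndexOf('/'));
          return slash >= 0 ? raw.substring(slash + 1) : raw;
      }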

  • How to preserve data integrity while minimizing the transmission size

    - by user1500578
    We have sensors in the wild that send their data to a server every day via TCP/IP, through either 3G or satellite at the physical layer. The sensors can automatically switch from one to the other depending on their location and the quality of the signal with the local 3G operator. Given that 3G and satellite communications are very expensive, we want to minimize the amount of data sent, but we also want to protect ourselves against lost data. What would be the best strategy to ensure, with reasonable certainty, that the integrity of our data is preserved, while minimizing the amount of redundancy, i.e. the amount of data transmitted? I've read about the zfec codec, but I'm not sure whether we need to transmit all the chunks, or whether we need to send a hash code along with each chunk.
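
    One point worth separating out: on a TCP connection, lost or corrupted packets are already retransmitted by the transport layer, so per-chunk forward error correction (what zfec provides) mostly pays off on lossy datagram links rather than on a TCP stream. A cheaper end-to-end guard is to compress each reading and append a digest the server verifies before acknowledging. A sketch of that assumption in Java:

      import java.io.ByteArrayOutputStream;
      import java.security.MessageDigest;
      import java.util.zip.GZIPOutputStream;

      class SensorFraming {
          // Compress the reading, then append a SHA-256 digest so the server
          // can verify the payload end to end before acknowledging it.
          static byte[] packageReadings(byte[] readings) throws Exception {
              ByteArrayOutputStream compressed = new ByteArrayOutputStream();
              try (GZIPOutputStream gzip = new GZIPOutputStream(compressed)) {
                  gzip.write(readings);
              }
              byte[] payload = compressed.toByteArray();
              byte[] digest = MessageDigest.getInstance("SHA-256").digest(payload);

              // Frame layout: compressed payload followed by its 32-byte digest.
              ByteArrayOutputStream frame = new ByteArrayOutputStream();
              frame.write(payload);
              frame.write(digest);
              return frame.toByteArray();
          }
      }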

  • Last Observation Carried Forward In a data frame?

    - by Tal Galili
    Hi all, I wish to implement "Last Observation Carried Forward" for a data set I am working on, which has missing values at the end of it. Here is a simple function that does it (question after it):

      LOCF <- function(x) {
        # Last Observation Carried Forward (for a left-to-right series)
        LOCF <- max(which(!is.na(x))) # the location of the last observation to carry forward
        x[LOCF:length(x)] <- x[LOCF]
        return(x)
      }

      # examples:
      LOCF(c(1, 2, 3, 4, NA, NA))
      LOCF(c(1, NA, 3, 4, NA, NA))

    Now this works great for simple vectors, but if I were to use it on a data frame:

      a <- data.frame(rep("a", 4), 1:4, 1:4, c(1, NA, NA, NA))
      a
      t(apply(a, 1, LOCF)) # will make a mess

    it turns my data frame into a character matrix. Can you think of a way to do LOCF on a data.frame without turning it into a matrix? (I could use loops and such to correct the mess, but would love a more elegant solution.) Cheers, Tal

  • Merging MySQL Structure and Data

    - by Shahid
    I have a MySQL database running on a deployment machine which also contains data. I have another MySQL database whose structure and data have both evolved over time. I need a way to merge the changes (only), in both structure and data, into the DB on the deployment machine without disturbing its existing data. Does anyone know of a tool that can do this safely? I have had a look at a few comparison tools, but I need a tool which can automate the merge operation. Note also that most of the data in the tables is binary, so I can't use many file comparison tools. Does anyone know of a solution to this? Thanks.

  • JavaScript data grid for millions of rows

    - by Rudiger
    I need to present millions of rows of data to users in a grid using JavaScript. I don't want the user thinking about viewing finite pieces of data; it should appear that all of the data is available. Rather than downloading all the data at once, small chunks are downloaded as the user comes to them (by scrolling, keying in row numbers, searching). The rows will not be edited through this front end, so read-only answers are acceptable. What JavaScript data grids are best for this kind of seamless paging?

  • Advice on simple efficient way to store web form data when no db/auth required

    - by ted776
    Hi, I have a situation where I need an efficient way to process and store comments submitted via a web form. I would normally use PHP and either MySQL or XML to store the data, but this is slightly different: the web form will only be temporarily available in a closed LAN environment, and all I need to do is process the form data and store it in a format which can be accessed by another application on the LAN (Adobe Director). Each request made by the Director app should pop the stack of data. I'm wondering how best to store the data in this type of situation, as it's not something I would normally do. I'm thinking possibly an XML file, but any advice would be great!

  • Letting a WCF data service implement another contract

    - by Wasim
    Hi all, I have a WCF data service with the standard configuration. I want to add more functionality to it, so I built a contract interface and let my WCF data service implement it. Now I see in the service the InitializeService method and the contract interface methods. But when I try to connect to the service, I get an error that there is no endpoint declared for the contract I added. How can I do that? Examples? Links? I chose to add the contract interface to the WCF data service rather than adding another service, because the client application uses the objects generated from the WCF data service, and I want to use those same objects for operations not related to data, for more complex processing. If I put the methods in another service, I get type incompatibility. Thanks in advance...

  • JavaScript - question regarding data structure

    - by orokusaki
    I'm trying to calculate somebody's bowel health based on a points system. I can edit the data structure or the logic in any way; I'm just trying to write a function and a data structure to handle this. Pseudo calculator function:

      // Bowel health calculator
      var points = 0;
      if age is from 30 to 34: points += 1
      if age is from 35 to 40: points += 2
      if daily BMs is from 1 to 3: points -= 1
      if daily BMs is from 4 to 6: points -= 2
      return points;

    Pseudo data structure:

      var points_map = {
        age: {
          '30-34': 1,
          '35-40': 2
        },
        dbm: {
          '1-3': -1,
          '4-6': -2
        }
      };

    I have a full spreadsheet of data like this, and I am trying to write a DRY version of the code and of this data (i.e. probably not a string for the '30-34', etc.) to handle this sort of thing without a humongous number of switch statements.
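
    The '30-34' style string keys can be replaced by real range objects so the lookup stays generic. A sketch of that shape, written in Java for illustration (the same table translates to a JS array of {min, max, points} objects):

      import java.util.List;
      import java.util.Map;

      // One scored range: [min, max] inclusive maps to a points delta.
      record ScoredRange(int min, int max, int points) {
          boolean contains(int value) { return value >= min && value <= max; }
      }

      class BowelHealthCalculator {
          // The whole spreadsheet collapses into one table: factor -> ranges.
          private final Map<String, List<ScoredRange>> pointsMap = Map.of(
              "age", List.of(new ScoredRange(30, 34, 1), new ScoredRange(35, 40, 2)),
              "dbm", List.of(new ScoredRange(1, 3, -1), new ScoredRange(4, 6, -2))
          );

          // One generic lookup replaces a switch statement per factor.
          int score(String factor, int value) {
              for (ScoredRange range : pointsMap.getOrDefault(factor, List.of())) {
                  if (range.contains(value)) return range.points();
              }
              return 0;
          }
      }

    The total is then just score("age", age) + score("dbm", dbm), summed over every factor in the spreadsheet.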

  • Cloud HUGE data storage options?

    - by ToughPal
    Hi, does anyone have a good suggestion on how to do video recording? We have a camera that can record and then stream live video to a server. This means we can have thousands of cameras sending data 24x7 for recording. We will store data for 7 / 14 / 30 days depending on the package. If a camera is sending data to the server, it will store 1.5 GB per day. So that means traffic of 1.5 GB/day/camera, and 45 GB/month/camera in total (data plus bandwidth for one camera). Please let me know the most cost-effective way to get this data stored. Thanks!
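
    For sizing, the question's own numbers already pin down how much storage each retention package holds per camera. A quick back-of-the-envelope check (a sketch; it simply multiplies the figures above, treating a month as 30 days):

      class StorageSizing {
          public static void main(String[] args) {
              double gbPerDay = 1.5;              // per camera, from the question
              int[] retentionDays = {7, 14, 30};  // the offered packages
              for (int days : retentionDays) {
                  System.out.printf("retention %d days -> %.1f GB held per camera%n",
                          days, days * gbPerDay);
              }
              System.out.printf("monthly transfer: %.1f GB per camera%n", 30 * gbPerDay);
          }
      }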

  • How to Embed/Link binary data into a C++ DLL

    - by CrimsonX
    So I have a Visual Studio 2008 project which references a large amount of binary data. I would like to package the binary data much as you can in C# by adding it as a "resource" and compiling it into a DLL. Let's say all my data has an extension of ".data" and is currently read from the Visual Studio project. Is there a way to compile or link the data into the DLL which is calling it? I've looked at some of the Google links on this and so far I haven't come up with anything; the only possible solution I've found is to use something like ResGen to create a .resources file and then link it using AssemblyLinker with the /Embed or /Link flags. I don't think it'd work properly, though, because I don't have text files to create the .resources files from, but rather the binary files themselves. Any advice?

  • Sort ranges in an array in Google Apps Script

    - by user1637113
    I have a timesheet spreadsheet for our company and I need to sort the employees by each timesheet block (15 rows by 20 columns). I have the following code, which I had help with, but the array quits sorting once it comes to a block without an employee name (I would like those blocks shuffled to the bottom). Another complication is that there are numerous formulas in these cells, and when I run the script as is, it removes them; I would like to keep them intact if at all possible. Here's the code:

      function sortSections() {
        var activeSheet = SpreadsheetApp.getActiveSheet();
        // SETTINGS
        var sheetName = activeSheet.getSheetName(); // name of sheet to be sorted
        var headerRows = 53;     // number of header rows
        var pageHeaderRows = 5;  // page totals to top of next emp section
        var sortColumn = 11;     // index of column to be sorted by; 1 = column A
        var pageSize = 65;
        var sectionSize = 15;    // number of rows in each section
        var col = sortColumn - 1;
        var sheet = SpreadsheetApp.getActive().getSheetByName(sheetName);
        var data = sheet.getRange(headerRows + 1, 1, sheet.getMaxRows() - headerRows, sheet.getLastColumn()).getValues();
        var data3d = [];
        var dataLength = data.length / sectionSize;
        for (var i = 0; i < dataLength; i++) {
          data3d[i] = data.splice(0, sectionSize);
        }
        data3d.sort(function(a, b) {
          return (((a[0][col] < b[0][col]) && a[0][col]) ? -1 : ((a[0][col] > b[0][col]) ? 1 : 0));
        });
        var sortedData = [];
        for (var k in data3d) {
          for (var l in data3d[k]) {
            sortedData.push(data3d[k][l]);
          }
        }
        sheet.getRange(headerRows + 1, 1, sortedData.length, sortedData[0].length).setValues(sortedData);
      }

  • Managing test data for JUnit tests

    - by nobody
    Hi, we are facing a problem managing test data (XML files which are used to create mock objects). The data we currently have has evolved over a long period of time; each time we add new functionality or a new test case, we add new data to test it. Now the problem is that when a business requirement changes the format (like the length or format of a variable), or makes any change the test data doesn't support, we need to change the entire body of test data, which is hundreds of MBs in size. Could anyone suggest a better method or process to overcome this problem? Any suggestion would be appreciated.
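
    One common remedy, offered here purely as an illustration: stop hand-maintaining the XML and generate it from a single fixture builder, so a format change touches one class instead of hundreds of megabytes of files. A sketch with made-up names:

      // Hypothetical "object mother" style builder: every test builds its
      // fixture through this one API, so when the business changes a field's
      // length or format, only this class needs updating.
      class CustomerFixture {
          private String name = "default-name";
          private int accountLength = 8; // the single place to change a format rule

          CustomerFixture withName(String name) { this.name = name; return this; }
          CustomerFixture withAccountLength(int len) { this.accountLength = len; return this; }

          // Renders the XML the mocks consume, in the currently required format.
          String toXml() {
              String account = "0".repeat(accountLength);
              return "<customer><name>" + name + "</name>"
                   + "<account>" + account + "</account></customer>";
          }
      }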

  • Tools similar to HTTP Watch or YSlow for Google Chrome browser

    - by GustlyWind
    Hi, we are testing our app in Google Chrome for support. The basic loading is scrambled and all the pages are in a total CSS mess, which unfortunately we need to clean up. For this I need a tool that works like Firebug does for Mozilla. I would also like a tool similar to HTTP Watch to check headers, cookies, caching, POST data, compression, redirection and chunked encoding. Any suggestions? Thanks.

  • ASP.NET FileUpload not working in Google Chrome

    - by Jignesh
    The ASP.NET FileUpload control is not working in Google Chrome: it shows a validation error even after choosing the right file type. Any solution? Here is the code:

      <asp:FileUpload ID="FU1" runat="server" />
      <asp:RegularExpressionValidator id="FileUpLoadValidator" runat="server"
          ErrorMessage="Upload jpg and gif only."
          ValidationExpression="^(([a-zA-Z]:)|(\\{2}\w+)\$?)(\\(\w[\w].*))(.jpg|.JPG|.gif|.GIF)$"
          ControlToValidate="FU1">
      </asp:RegularExpressionValidator>

  • Google Reader-like architecture

    - by islam
    Dear all, I want to design the architecture for an application like Google Reader that saves users' feeds (RSS, Atom) in a database. What is the best architecture for something like this? I need to know a good DB design, whether I have to save anything in files (XML or something), what needs to be archived, etc. I hope to find some hints.
