Search Results

Search found 3 results on 1 page for 'jsonpath'.

  • Craft a JSONPath expression so that it retrieves only a specific value?

    - by Alex
    Hello, I have some JSON of which the following is a small sample:

        "results": {
            "div": [
                {
                    "class": "sylEntry",
                    "id": "sylOne",
                    "div": [
                        { "class": "sT", "id": "sOT", "p": "Mon 11/17, Computer work time" },
                        { "class": "des", "id": "dOne", "p": "All classes Siebel 0218" }
                    ]
                },

    I would like to retrieve only the "p" content for the div elements with class "sT". I would like to loop over the results, but doing something like this:

        var arrayOfResults = $.results..div.p

    does not work, because it retrieves the "p" value of every div, not just the divs with class "sT". So how do I construct my JSONPath so that it retrieves the array of "p" values contained within the divs with class "sT"? Thanks!! (One possible expression is sketched after this entry.)

    Read the article
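
    A minimal sketch of one way to do this, assuming the sample above is part of a parsed object held in a variable (here called data, a name not in the original post) and that a JavaScript JSONPath implementation such as Stefan Goessner's jsonPath() function is loaded. The [?(...)] filter is what restricts the match to divs whose class is "sT":

        // Recursively find every "div" entry, keep only the objects whose
        // class is "sT", and collect their "p" values.
        var sTParagraphs = jsonPath(data, "$..div[?(@.class=='sT')].p");
        // sTParagraphs should be an array such as ["Mon 11/17, Computer work time"],
        // or false if nothing matched.

    The expression in the question, $.results..div.p, matches the "p" of every div; the filter step is what that expression is missing.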

  • Dojo Datagrid Filtering Issue

    - by Zoom Pat
    I am having a hard time filtering a DataGrid. Please help! This is how I draw the grid:

        var jsonStore = new dojo.data.ItemFileWriteStore({ data: columnValues });
        gridInfo = {
            store: jsonStore,
            queryOptions: { ignoreCase: true },
            structure: layout
        };
        grid = new dojox.grid.DataGrid(gridInfo, "gridNode");
        grid.startup();

    Now if I try something like this, it works fine and gives me the rows whose AGE_FROM column value equals 63:

        grid.filter({ AGE_FROM: 63 });

    But I need all kinds of filtering, not just 'equal to': I need to obtain all the rows where AGE_FROM is greater than 63, less than 63, less than or equal to 63, not equal to 63, and so on, and something like

        grid.filter({ AGE_FROM: <63 });

    does not work. One other way I was thinking of was to use the following:

        filteredStore = new dojox.jsonPath.query(filterData, "[?(@.AGE_FROM = 63]");

    and then draw the grid with filteredStore, but that is not working for a != operator either. Once I figure out a good way to filter the grid, I also need a way to filter on dates. I am trying to find a good example of filtering a DataGrid, but most of the examples only filter on the 'equal to' criteria. Any help is highly appreciated. (One possible approach is sketched after this entry.)

    Read the article
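
    Since the store query syntax behind grid.filter() is geared toward exact values and wildcard patterns rather than range comparisons, one workable sketch is to filter the rows in plain JavaScript and hand the grid a fresh store. This is untested against any particular Dojo release; rawData and applyFilter are hypothetical names, and rawData is assumed to be the plain { identifier: ..., items: [...] } object the grid data came from, untouched by any store:

        // Rebuild the store from the rows that pass an arbitrary predicate.
        function applyFilter(predicate) {
            // Clone per call: ItemFileWriteStore annotates the data object
            // it is handed, so never feed it the pristine original directly.
            var data = dojo.clone(rawData);
            data.items = dojo.filter(data.items, predicate);
            grid.setStore(new dojo.data.ItemFileWriteStore({ data: data }));
        }

        // Any comparison works, not just equality:
        applyFilter(function (row) { return row.AGE_FROM <= 63; });
        applyFilter(function (row) { return row.AGE_FROM !== 63; });

    The same predicate style extends to dates, e.g. comparing parsed Date values inside the callback.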

  • What are good CLI tools for JSON?

    - by jasonmp85
    General Problem

    Though I may be diagnosing the root cause of an event, determining how many users it affected, or distilling timing logs in order to assess the performance and throughput impact of a recent code change, my tools stay the same: grep, awk, sed, tr, uniq, sort, zcat, tail, head, join, and split. To glue them all together, Unix gives us pipes, and for fancier filtering we have xargs. If these fail me, there's always perl -e.

    These tools are perfect for processing CSV files, tab-delimited files, log files with a predictable line format, or files with comma-separated key-value pairs. In other words, files where each line has next to no context.

    XML Analogues

    I recently needed to trawl through gigabytes of XML to build a histogram of usage by user. This was easy enough with the tools I had, but for more complicated queries the normal approaches break down. Say I have files with items like this:

        <foo user="me">
          <baz key="zoidberg" value="squid" />
          <baz key="leela" value="cyclops" />
          <baz key="fry" value="rube" />
        </foo>

    And let's say I want to produce a mapping from user to the average number of <baz>s per <foo>. Processing line-by-line is no longer an option: I need to know which user's <foo> I'm currently inspecting so I know whose average to update. Any sort of Unix one-liner that accomplishes this task is likely to be inscrutable.

    Fortunately, in XML-land we have wonderful technologies like XPath, XQuery, and XSLT to help us. Previously, I had gotten accustomed to using the wonderful XML::XPath Perl module to accomplish queries like the one above, but after finding a TextMate plugin that could run an XPath expression against my current window, I stopped writing one-off Perl scripts to query XML. And I just found out about XMLStarlet, which is installing as I type this and which I look forward to using in the future.

    JSON Solutions?

    So this leads me to my question: are there any tools like this for JSON? It's only a matter of time before some investigation task requires me to do similar queries on JSON files, and without tools like XPath and XSLT, such a task will be a lot harder. If I had a bunch of JSON that looked like this:

        {
          "firstName": "Bender",
          "lastName": "Robot",
          "age": 200,
          "address": {
            "streetAddress": "123",
            "city": "New York",
            "state": "NY",
            "postalCode": "1729"
          },
          "phoneNumber": [
            { "type": "home", "number": "666 555-1234" },
            { "type": "fax", "number": "666 555-4567" }
          ]
        }

    and wanted to find the average number of phone numbers each person had, I could do something like this with XPath:

        fn:avg(/fn:count(phoneNumber))

    Questions

      • Are there any command-line tools that can "query" JSON files in this way?
      • If you have to process a bunch of JSON files on a Unix command line, what tools do you use?
      • Heck, is there even work being done to make a query language like this for JSON?
      • If you do use tools like this in your day-to-day work, what do you like/dislike about them? Are there any gotchas?

    I'm noticing that more and more data serialization is being done using JSON, so processing tools like this will be crucial when analyzing large data dumps in the future. Language libraries for JSON are very strong, and it's easy enough to write scripts to do this sort of processing, but to really let people play around with the data, shell tools are needed. (A sketch of the kind of query I mean appears after this entry.)

    Related Questions

      • Grep and Sed Equivalent for XML Command Line Processing
      • Is there a query language for JSON?
      • JSONPath or other XPath-like utility for JSON/JavaScript; or jQuery JSON

    Read the article
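
    As a sketch of the kind of query the post describes, here is a small Node.js script (Node and the file names are assumptions, not something from the original post) that computes the average number of phoneNumber entries across a set of JSON files shaped like the sample above:

        // avg_phones.js -- a rough sketch, not an existing CLI tool.
        // Usage (hypothetical file names): node avg_phones.js person1.json person2.json
        var fs = require('fs');

        var files = process.argv.slice(2);
        var total = 0;

        files.forEach(function (file) {
            var person = JSON.parse(fs.readFileSync(file, 'utf8'));
            // Treat a missing phoneNumber field as zero numbers.
            total += (person.phoneNumber || []).length;
        });

        console.log('average phone numbers per person: ' +
                    (files.length ? total / files.length : 0));

    This is still a one-off script rather than a shell one-liner, which is exactly the gap the question is asking command-line tools to fill.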
