Search Results

Search found 7582 results on 304 pages for 'mod filter'.

Page 112 of 304

  • WP Admin: Filter for a Custom Taxonomy and see posts of any/all status?

    - by tzeldin88
    On Admin Edit Posts (edit.php), how can I filter for a Custom Taxonomy and see posts of ANY status, not just Published? For example, say I have a Custom Taxonomy "Color"... These queries find posts of ANY status: edit.php?category_name=foo, edit.php?author=3, edit.php?tag=foo. And this query works correctly: edit.php?color=red&post_status=draft. But this query finds ONLY Published posts: edit.php?color=red

    Read the article

  • CAML queries: how to filter folders from result set?

    - by drax
    Hi all, I'm using a CAML query to select all documents that were modified or added by a user. The query runs recursively on all subsites of a specified site collection. The problem is that I can't get rid of folders, which are also part of the result set. For now I'm filtering them out of the result DataTable, but I'm wondering: is it possible to filter out folders from the result set just by using CAML?

    Read the article

  • DRUPAL, Views: exposed filter... how can I unselect all tags?

    - by Patrick
    hi, I'm using Drupal, Views, and a tag exposed filter, and I would like to allow my customer to select the default tags from the back-end. However, once he selects some tags it is no longer possible to unselect all of them. See the initial picture: http://dl.dropbox.com/u/72686/Picture%201.png Here all the tags are unselected, but if I select just one of them, I can no longer come back to the initial configuration (at least one tag remains selected). How can I fix this? thanks

    Read the article

  • Is there a way to filter out offensive words from Jcaptcha?

    - by elduff
    We are using JCaptcha for a captcha tool in a small app that my team is writing. However, just during development time (on a small team - 4 of us), we've run across a number of curse words and other potentially offensive words for the actual captchas. Is there a way to filter out potentially offensive words so that they are not presented to the user?
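
    One approach the team could take is to wrap whatever word generator the captcha uses and simply regenerate until the produced word contains no blacklisted fragment. The sketch below is library-agnostic Java; the CleanWordGenerator class and the functional "source" it wraps are invented for illustration and are not part of the JCaptcha API. (An even simpler trick, if the generator builds words from a character set, is to remove the vowels so recognizable words cannot form at all.)

        import java.util.Locale;
        import java.util.Set;
        import java.util.function.IntFunction;

        // Hypothetical helper, not JCaptcha API: wraps any word source and keeps
        // regenerating until the word contains no blacklisted substring.
        public class CleanWordGenerator {
            private final IntFunction<String> source;  // adapt your real generator to this shape
            private final Set<String> blacklist;       // lower-cased offensive fragments
            private final int maxAttempts;

            public CleanWordGenerator(IntFunction<String> source, Set<String> blacklist, int maxAttempts) {
                this.source = source;
                this.blacklist = blacklist;
                this.maxAttempts = maxAttempts;
            }

            public String getWord(int length) {
                for (int i = 0; i < maxAttempts; i++) {
                    String candidate = source.apply(length);
                    if (isClean(candidate)) {
                        return candidate;
                    }
                }
                throw new IllegalStateException("could not generate a clean captcha word");
            }

            private boolean isClean(String word) {
                String lower = word.toLowerCase(Locale.ROOT);
                return blacklist.stream().noneMatch(lower::contains);
            }
        }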

    Read the article

  • How to load image data from a resource bitmap file for a DirectShow filter?

    - by Forrest
    I need to put one bitmap image into my DirectShow filter, so that the user can use this bitmap image without caring where it is. First, I import this bitmap file into the resource bundle and get an IDB_BITMAP1 resource ID. Then I need to read this IDB_BITMAP1, using OpenCV's cvLoadImage or some Windows image API, to load the image into a buffer. So the question is: how do I do this? Is it even possible? Thanks

    Read the article

  • In PHP, imagepng() accepts a filter parameter. How do these filters affect the function's output?

    - by Joe Lencioni
    How do these filters affect the output of imagepng() in PHP? PNG_NO_FILTER, PNG_FILTER_NONE, PNG_FILTER_SUB, PNG_FILTER_UP, PNG_FILTER_AVG, PNG_FILTER_PAETH, PNG_ALL_FILTERS. The documentation simply says, "A special PNG filter, used by the imagepng() function" for each of them. It seems that using PNG_NO_FILTER will reduce the filesize of the output, but other than that, I am unsure as to how the output is affected. Any insight would be really appreciated.
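
    For background: these constants correspond to the PNG scanline filters (None, Sub, Up, Average, Paeth) that the encoder may apply to each row before the data is DEFLATE-compressed. They are lossless predictors, so they never change the decoded pixels; they only influence file size and encoding time, and which one compresses best depends on the image content (PNG_ALL_FILTERS lets the encoder pick per scanline, while PNG_NO_FILTER often wins for palette images). As a rough illustration of what one of these filters does, here is the Sub filter applied to a single scanline, written in Java rather than PHP and not taken from GD or libpng:

        public class PngSubFilterDemo {
            // PNG "Sub" filter (type 1): each byte becomes the raw byte minus the
            // byte bpp positions to its left (bytes per pixel), modulo 256. The
            // decoder reverses this exactly, so only compressibility changes.
            static byte[] subFilter(byte[] raw, int bpp) {
                byte[] out = new byte[raw.length];
                for (int x = 0; x < raw.length; x++) {
                    int left = (x >= bpp) ? (raw[x - bpp] & 0xFF) : 0;
                    out[x] = (byte) ((raw[x] & 0xFF) - left);
                }
                return out;
            }

            public static void main(String[] args) {
                // A smooth gradient turns into a run of small, repetitive deltas,
                // which DEFLATE compresses far better than the raw bytes.
                byte[] scanline = new byte[16];
                for (int i = 0; i < scanline.length; i++) scanline[i] = (byte) (i * 3);
                System.out.println(java.util.Arrays.toString(subFilter(scanline, 1)));
            }
        }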

    Read the article

  • How can I filter a report with duplicate fields in related records?

    - by Graham Jones
    I have a report where I need to filter out records where there is a duplicate contract number within the same station but a different date. It is not considered a duplicate value because of the different date. I then need to summarize the costs and count the contracts, but even if I suppress the "duplicate fields" it will still summarize their values. I want to select the record with the most current date.

        Station   Trans-DT    Cost   Contract-No
        8         5/11/2010   10     5008
        8         5/12/2010   15     5008
        9         5/11/2010   12     5012
        9         5/15/2010   50     5012
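
    In other words, the rule being asked for is: keep only the latest-dated record per (Station, Contract-No), then sum the costs and count the contracts. A quick Java sketch of that logic, using the sample rows above (illustration only, not Crystal Reports syntax; the Row record is made up for the example):

        import java.time.LocalDate;
        import java.util.LinkedHashMap;
        import java.util.List;
        import java.util.Map;

        public class LatestContractDemo {
            // Hypothetical row shape mirroring the sample data in the question.
            record Row(int station, LocalDate transDate, double cost, int contractNo) {}

            public static void main(String[] args) {
                List<Row> rows = List.of(
                    new Row(8, LocalDate.of(2010, 5, 11), 10, 5008),
                    new Row(8, LocalDate.of(2010, 5, 12), 15, 5008),
                    new Row(9, LocalDate.of(2010, 5, 11), 12, 5012),
                    new Row(9, LocalDate.of(2010, 5, 15), 50, 5012));

                // Keep only the most recent row per (station, contractNo).
                Map<String, Row> latest = new LinkedHashMap<>();
                for (Row r : rows) {
                    String key = r.station() + ":" + r.contractNo();
                    latest.merge(key, r, (a, b) -> a.transDate().isAfter(b.transDate()) ? a : b);
                }

                double totalCost = latest.values().stream().mapToDouble(Row::cost).sum();
                // Prints: contracts=2 totalCost=65.0
                System.out.println("contracts=" + latest.size() + " totalCost=" + totalCost);
            }
        }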

    Read the article

  • What's the non-brute-force way to filter a Python dictionary?

    - by Thierry Lam
    I can filter the following dictionary like:

        data = {
            1: {'name': 'stackoverflow', 'traffic': 'high'},
            2: {'name': 'serverfault', 'traffic': 'low'},
            3: {'name': 'superuser', 'traffic': 'low'},
            4: {'name': 'mathoverflow', 'traffic': 'low'},
        }

        traffic = 'low'
        for k, v in data.items():
            if v['traffic'] == traffic:
                print k, v

    Is there an alternate way to do the above filtering?

    Read the article

  • How to filter a running VLC stream? (FROM CMD)

    - by Ole Jak
    so... I can easily broadcast my web cam with VLC using a command line like this (I use Windows):

        "C:\Program Files (x86)\VideoLAN\VLC\vlc.exe" -vvv -I --dshow-vdev="Logitech QuickCam Express / Go" dshow:// --sout

    When I paste the command into CMD and hit enter it starts streaming (all is fine - I can play it). How can I now, for example, add brightness or any other filter to that stream from CMD?

    Read the article

  • Can an ActiveRecord before_save filter halt the save without halting the transaction?

    - by Michael Boutros
    Is there any way for a before_save filter to halt the entire save without halting the transaction? What I'm trying to do is have a "sample" version of my model that the user can interact with and save, but whose changes are never actually saved. The following will halt the transaction and (naturally) return false when I call @model.update_attributes:

        before_save :ignore_changes_if_sample

        def ignore_changes_if_sample
          if self.sample?
            return false
          end
        end

    Thanks!

    Read the article

  • [Django] How do I filter the choices in a ModelForm that has a CharField with the choices attribute?

    - by nubela
    I understand I am able to filter the queryset of ForeignKey or ManyToManyField fields; however, how do I do that for a simple CharField that is rendered as a Select widget (select tag)? For example:

        PRODUCT_STATUS = (
            ("unapproved", "Unapproved"),
            ("approved", "Listed"),
            #("Backorder", "Backorder"),
            #("oos", "Out of Stock"),
            #("preorder", "Preorder"),
            ("userdisabled", "User Disabled"),
            ("disapproved", "Disapproved by admin"),
        )

    and the field:

        o_status = models.CharField(max_length=100, choices=PRODUCT_STATUS, verbose_name="Product Status", default="approved")

    Suppose I wish to limit it to just "approved" and "userdisabled" instead of showing the full array (which is what I want to show in the admin), how do I do it? Thanks!

    Read the article

  • How can I test for an empty Breeze predicate?

    - by Megan
    I'm using Breeze to filter data requested on the client. My code looks a little like this:

    Client - Creating Filter Predicate

        var predicates = [];
        var criteriaPredicate = null;
        $.each(selectedFilterCriteria(), function (index, item) {
            criteriaPredicate = (index == 0)
                ? breeze.Predicate.create('criteriaId', breeze.FilterQueryOp.Equals, item)
                : criteriaPredicate.or('criteriaId', breeze.FilterQueryOp.Equals, item);
        });
        if (breeze.Predicate.isPredicate(criteriaPredicate)) {
            predicates.push(criteriaPredicate);
        }
        // Repeat for X Filter Criteria

        var filter = breeze.Predicate.and(predicates);
        return context.getAll(filter, data);

    Client - Context Query

        function getAll(predicate, dataObservable) {
            var query = breeze.EntityQuery.from('Data');
            if (breeze.Predicate.isPredicate(predicate)) {
                query = query.where(predicate);
            }
            return manager.executeQuery(query).then(success).fail(failure);
        }

    Issue

    I'm having an issue with the request because, if there are no filters set, I apply an "empty" predicate (due to the var filter = breeze.Predicate.and([]) line), resulting in a request like http://mysite/api/app/Data?$filter=. The request is an invalid OData query since the value of the $filter argument cannot be empty. Is there a good way for me to check for an empty predicate? I know I can refactor my client code to not use a predicate unless there is at least one filterable item, but I thought I would check first to see if I overlooked some property or method on the Breeze Predicate.

    Read the article

  • How can I refactor this to work without breaking the pattern horribly?

    - by SnOrfus
    I've got a base class object that is used for filtering. It's a template method object that looks something like this:

        public class Filter {
            public void Process(User u, GeoRegion r, int countNeeded) {
                List<account> selected = this.Select(u, r, countNeeded);            // 1
                List<account> filtered = this.Filter(selected, u, r, countNeeded);  // 2
                if (filtered.Count > 0) { /* do businessy stuff */ }                // 3
                if (filtered.Count < countNeeded)
                    this.SendToSuccessor(u, r, countNeeded - filtered.Count);       // 4
            }
        }

    Select(...) and Filter(...) are protected abstract methods implemented by the derived classes. Select(...) finds objects based on x criteria, and Filter(...) filters those selected further. If the remaining filtered collection has more than 1 object in it, we do some business stuff with it (unimportant to the problem here). SendToSuccessor(...) is called if there weren't enough objects found after filtering (it's a composite where the next class in succession will also be derived from Filter but have different filtering criteria).

    All has been OK, but now I'm building another set of filters, which I was going to subclass from this. The filters I'm building, however, would require different params, and I don't want to just implement those methods and not use the params, or add the ones I need to the param list and have them go unused in the existing filters. They still perform the same logical process, though. I also don't want to complicate the consumer code for this (which looks like this):

        Filter f = new Filter1();
        Filter f2 = new Filter2();
        Filter f3 = new Filter3();
        f.Sucessor = f2;
        f2.Sucessor = f3;
        /* and so on, adding filters as successors to previous ones */
        foreach (User u in users) {
            foreach (GeoRegion r in regions) {
                f.Process(u, r, ##);
            }
        }

    How should I go about it?
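
    One way to handle this without changing the chain's shape is to pass a single context object through Process and let each concrete filter read only the fields it needs; new filters can carry extra parameters inside the context without touching the base signature or the consumer code. Below is a rough sketch of that idea written in Java (the question's code is C#); FilterContext, the field names and the Object placeholders are all invented for illustration, not taken from the original design.

        import java.util.List;
        import java.util.Map;

        // Invented for illustration: one bag of inputs travels down the whole chain,
        // so newer filter types can read extra parameters without changing process().
        class FilterContext {
            final Object user;                 // stand-in for User in the question
            final Object region;               // stand-in for GeoRegion
            final int countNeeded;
            final Map<String, Object> extras;  // anything only newer filters understand

            FilterContext(Object user, Object region, int countNeeded, Map<String, Object> extras) {
                this.user = user;
                this.region = region;
                this.countNeeded = countNeeded;
                this.extras = extras;
            }

            FilterContext withCountNeeded(int count) {
                return new FilterContext(user, region, count, extras);
            }
        }

        abstract class ChainFilter {
            ChainFilter successor;

            // Template method: same shape as Process() in the question.
            final void process(FilterContext ctx) {
                List<Object> selected = select(ctx);
                List<Object> filtered = filter(selected, ctx);
                if (!filtered.isEmpty()) {
                    // businessy stuff
                }
                if (filtered.size() < ctx.countNeeded && successor != null) {
                    successor.process(ctx.withCountNeeded(ctx.countNeeded - filtered.size()));
                }
            }

            protected abstract List<Object> select(FilterContext ctx);
            protected abstract List<Object> filter(List<Object> selected, FilterContext ctx);
        }

    The consumer loop stays as it is: build one FilterContext per (user, region) pair and hand it to the head of the chain.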

    Read the article

  • Query optimization using composite indexes

    - by xmarch
    Many times, during the process of creating a new Coherence application, developers do not pay attention to the way cache queries are constructed; they only check that these queries comply with functional specs. Later, performance testing shows that the queries perform poorly, and it is only then that developers start working on improvements until the non-functional performance requirements are met. This post describes the optimization process of a real-life scenario, where using a composite attribute index brought a radical improvement in query execution times: they went down from 4 seconds to 2 milliseconds!

    E-commerce solution based on Oracle ATG – Endeca

    In the context of a new e-commerce solution based on Oracle ATG – Endeca, Oracle Coherence has been used to calculate and store SKU prices. In this architecture, a Coherence cache stores the final SKU prices used for Endeca baseline indexing. Each SKU price is calculated from a base SKU price and a series of calculations based on information from corporate global discounts. Corporate global discount information is stored in an auxiliary Coherence cache with over 800,000 entries. In particular, to obtain each price the process needs to execute six queries over the global discounts cache.

    After the implementation was finished, we discovered that the most expensive step in the price calculation process was the global discounts cache query. This query has 10 parameters and is executed 6 times for each SKU price calculation. The steps taken to optimize this query are described below.

    Starting point

    The initial query was:

        String filter = "levelId = :iLevelId AND salesCompanyId = :iSalesCompanyId AND salesChannelId = :iSalesChannelId "+
            "AND departmentId = :iDepartmentId AND familyId = :iFamilyId AND brand = :iBrand AND manufacturer = :iManufacturer "+
            "AND areaId = :iAreaId AND endDate >= :iEndDate AND startDate <= :iStartDate";
        Map<String, Object> params = new HashMap<String, Object>(10);
        // Fill all parameters.
        params.put("iLevelId", xxxx);
        // Executing filter.
        Filter globalDiscountsFilter = QueryHelper.createFilter(filter, params);
        NamedCache globalDiscountsCache = CacheFactory.getCache(CacheConstants.GLOBAL_DISCOUNTS_CACHE_NAME);
        Set applicableDiscounts = globalDiscountsCache.entrySet(globalDiscountsFilter);

    With the small dataset used for development the cache queries performed very well. However, when carrying out performance testing with a real-world sample size of 800,000 entries, each query execution was taking more than 4 seconds.

    First round of optimizations

    The first optimization step was the creation of a separate Coherence index for each of the 10 attributes used by the filter. This avoided object deserialization while executing the query. Each index was created as follows:

        globalDiscountsCache.addIndex(new ReflectionExtractor("getXXX"), false, null);

    After adding these indexes the query execution time was reduced to between 450 ms and 1 s. However, these execution times were still not good enough.

    Second round of optimizations

    In this optimization phase a Coherence query explain plan was used to identify how many entries each index reduced the result set by, along with the cost in ms of executing that part of the query. Though the explain plan showed that all the indexes for the query were being used, it also showed that the ordering of the query parameters was "sub-optimal".
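
    For reference, a sketch of how such an explain plan can be obtained: Coherence 3.7.1 and later ship a QueryRecorder aggregator that, run against the same cache and filter, reports per index how many entries it eliminated and how long that step cost. The snippet below simply reuses the cache and filter names from the code above; it is a sketch, not part of the original optimization work.

        import com.tangosol.util.aggregator.QueryRecorder;

        // Sketch: ask Coherence to explain how it would evaluate the filter.
        // The returned QueryRecord's toString() prints the per-index statistics.
        Object plan = globalDiscountsCache.aggregate(globalDiscountsFilter,
                new QueryRecorder(QueryRecorder.RecordType.EXPLAIN));
        System.out.println(plan);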

    Parameters associated with object attributes of high cardinality should appear at the beginning of the filter; more specifically, the attributes that filter out the highest number of records should be placed at the beginning. But examining the corporate global discount data we realized that, depending on the values of the parameters used in the query, the "good" order for the attributes was different. In particular, if the attributes brand and family had specific values it was more optimal to use a different query, changing the order of the attributes. Ultimately, we ended up with three different optimal variants of the query, each used in its relevant case:

        String filter = "brand = :iBrand AND familyId = :iFamilyId AND departmentId = :iDepartmentId AND levelId = :iLevelId "+
            "AND manufacturer = :iManufacturer AND endDate >= :iEndDate AND salesCompanyId = :iSalesCompanyId "+
            "AND areaId = :iAreaId AND salesChannelId = :iSalesChannelId AND startDate <= :iStartDate";

        String filter = "familyId = :iFamilyId AND departmentId = :iDepartmentId AND levelId = :iLevelId AND brand = :iBrand "+
            "AND manufacturer = :iManufacturer AND endDate >= :iEndDate AND salesCompanyId = :iSalesCompanyId "+
            "AND areaId = :iAreaId AND salesChannelId = :iSalesChannelId AND startDate <= :iStartDate";

        String filter = "brand = :iBrand AND departmentId = :iDepartmentId AND familyId = :iFamilyId AND levelId = :iLevelId "+
            "AND manufacturer = :iManufacturer AND endDate >= :iEndDate AND salesCompanyId = :iSalesCompanyId "+
            "AND areaId = :iAreaId AND salesChannelId = :iSalesChannelId AND startDate <= :iStartDate";

    Using the appropriate query depending on the values of the brand and family parameters, the query execution time dropped to between 100 ms and 150 ms. But these execution times were still not good enough, and the solution was cumbersome.

    Third and last round of optimizations

    The third and final optimization was to introduce a composite index. However, this did mean that it was not possible to use the Coherence Query Language (CohQL), as composite indexes are not currently supported in CohQL. As the original query had 8 parameters using EqualsFilter, 1 using GreaterEqualsFilter and 1 using LessEqualsFilter, the composite index was built for the 8 attributes using EqualsFilter. The final query had an EqualsFilter for the multiple extractor, plus a GreaterEqualsFilter and a LessEqualsFilter for the 2 remaining attributes. All individual indexes were dropped except the ones being used for LessEqualsFilter and GreaterEqualsFilter. We were now running a scenario with an 8-attribute composite filter and 2 single-attribute filters.

    The composite index created was as follows:

        ValueExtractor[] ve = { new ReflectionExtractor("getSalesChannelId"), new ReflectionExtractor("getLevelId"),
            new ReflectionExtractor("getAreaId"), new ReflectionExtractor("getDepartmentId"),
            new ReflectionExtractor("getFamilyId"), new ReflectionExtractor("getManufacturer"),
            new ReflectionExtractor("getBrand"), new ReflectionExtractor("getSalesCompanyId") };
        MultiExtractor me = new MultiExtractor(ve);
        NamedCache globalDiscountsCache = CacheFactory.getCache(CacheConstants.GLOBAL_DISCOUNTS_CACHE_NAME);
        globalDiscountsCache.addIndex(me, false, null);

    And the final query was:

        ValueExtractor[] ve = { new ReflectionExtractor("getSalesChannelId"), new ReflectionExtractor("getLevelId"),
            new ReflectionExtractor("getAreaId"), new ReflectionExtractor("getDepartmentId"),
            new ReflectionExtractor("getFamilyId"), new ReflectionExtractor("getManufacturer"),
            new ReflectionExtractor("getBrand"), new ReflectionExtractor("getSalesCompanyId") };
        MultiExtractor me = new MultiExtractor(ve);
        // Fill composite parameters.
        String SalesCompanyId = xxxx;
        ...
        AndFilter composite = new AndFilter(
            new EqualsFilter(me, Arrays.asList(iSalesChannelId, iLevelId, iAreaId, iDepartmentId,
                iFamilyId, iManufacturer, iBrand, SalesCompanyId)),
            new GreaterEqualsFilter(new ReflectionExtractor("getEndDate"), iEndDate));
        AndFilter finalFilter = new AndFilter(composite,
            new LessEqualsFilter(new ReflectionExtractor("getStartDate"), iStartDate));
        NamedCache globalDiscountsCache = CacheFactory.getCache(CacheConstants.GLOBAL_DISCOUNTS_CACHE_NAME);
        Set applicableDiscounts = globalDiscountsCache.entrySet(finalFilter);

    Using this composite index the query improved dramatically and the execution time dropped to between 2 ms and 4 ms. These execution times completely met the non-functional performance requirements. It should be noted that when using the composite index the order of the attributes inside the ValueExtractor array was not relevant.

    Read the article
