Search Results

Search found 8687 results on 348 pages for 'per'.


  • Database EAV model, record listing as per search

    - by Shyam Sunder Verma
    I am building a dynamic application. I have three tables (EAV model style): 1) Items (ItemId, ItemName); 2) Fields (FieldId, FieldName); 3) Field Values (ItemID, FieldId, Value). Can you tell me how to write a SINGLE query to get the first 20 records from ALL items where the value for FieldId=4 is TRUE? Expected result: columns = ItemID | Name | Field1 | Field2 | Field3; each row = ItemId | ItemName | Value1 | Value2 | Value3. Important concerns: 1) The number of fields per item is not known. 2) I need to write ONE query. 3) The query will be running on 100K records, so performance is a concern. 4) I am using MySQL 5.0, so I need a solution for MySQL. Should I denormalize the tables if the above query is not possible at all? Any advice?
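
    A minimal sketch of one common approach (mine, not from the question): pivot a fixed list of FieldIds into columns with MAX(CASE ...), filter on FieldId=4 in a HAVING clause, and page with LIMIT. It assumes the value table is named FieldValues and that TRUE is stored as the string 'TRUE'; since plain SQL cannot produce a dynamic column list, the application would regenerate the CASE branches whenever the field set changes:

        -- Sketch only: FieldValues and the 'TRUE' literal are assumptions,
        -- and one CASE branch is generated per known field.
        SELECT i.ItemId,
               i.ItemName,
               MAX(CASE WHEN fv.FieldId = 1 THEN fv.Value END) AS Field1,
               MAX(CASE WHEN fv.FieldId = 2 THEN fv.Value END) AS Field2,
               MAX(CASE WHEN fv.FieldId = 3 THEN fv.Value END) AS Field3
        FROM Items i
        JOIN FieldValues fv ON fv.ItemID = i.ItemId
        GROUP BY i.ItemId, i.ItemName
        HAVING MAX(CASE WHEN fv.FieldId = 4 THEN fv.Value END) = 'TRUE'
        ORDER BY i.ItemId
        LIMIT 20;

    With 100K rows this stays workable only if the value table is indexed on (ItemID, FieldId); otherwise denormalizing the FieldId=4 flag into Items is a reasonable fallback.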

    Read the article

  • Get latest SQL rows based on latest date and per user

    - by Umair
    I have the following table (RowId, UserId, Date): (1, 1, 1/1/01), (2, 1, 2/1/01), (3, 2, 5/1/01), (4, 1, 3/1/01), (5, 2, 9/1/01). I want to get the latest record per UserId based on Date, but as part of the following query (for a reason I cannot change: the query is auto-generated by a tool, but I can pass in anything starting with AND...): SELECT RowId, UserId, Date FROM MyTable WHERE 1 = 1 AND ( // everything which needs to be done goes here ). I have tried a similar query, but get an error: Only one expression can be specified in the select list when the subquery is not introduced with EXISTS.
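
    A sketch of one way to stay inside that constraint (table and column names taken from the question; the error message suggests SQL Server, but this form is portable): append a correlated subquery that keeps only the row whose Date is the maximum for its UserId. The subquery returns a single value, so it avoids the "only one expression" error:

        SELECT RowId, UserId, Date
        FROM MyTable
        WHERE 1 = 1
          -- the tool-generated part stops above; this is the added AND clause
          AND Date = (SELECT MAX(t2.Date)
                      FROM MyTable t2
                      WHERE t2.UserId = MyTable.UserId)

    If two rows share a user's maximum Date, both come back; breaking the tie would need an extra condition on RowId.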

    Read the article

  • How to have separate keys per record in mongo_mapper + Rails

    - by Vitaly Kushner
    When I'm adding a record in MongoDB I can specify whatever keys I want and it will store them in the db. The problem is that it will remember those keys for the next time I insert another record. So, for example, if I do Product.create :foo => 123 and then Product.create :bar => 456, I get a :foo => nil field in the 2nd record. This is definitely not a limitation of MongoDB itself, since if I restart the rails console and create yet another record with a different set of columns, it will not add the columns from the first two records. So it seems like mongo_mapper remembers all the keys used and inserts them all into all records, even if values are not provided. The question is obviously: how do I disable this crazy attribute explosion? Basically I want only the 'permanent' keys that I specify in the model to be in every record, but all the 'extra' attributes to be specified per record and not mess up subsequent records.

    Read the article

  • Measuring CPU time per-thread on Windows

    - by Eli Courtwright
    I'm developing a long-running multi-threaded Python application for Windows, and I want the process to know the CPU time that each of its threads has taken. I can get the overall times for the entire process with os.times() but I need to know the per-thread times. I know that there are external tools such as the Sysinternals Process Explorer, but my program itself needs to have this information. If I were on Linux, I'd look in the /proc filesystem, as described here. If I were writing C code, I'd use the GetThreadTimes call, as described here. So how can I accomplish this on Windows using Python?
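
    A minimal sketch (my own, not from the question) of wrapping that same GetThreadTimes call from Python with ctypes; each worker thread would call this on itself, and measuring other threads would additionally need OpenThread with their thread IDs:

        import ctypes
        from ctypes import wintypes

        kernel32 = ctypes.windll.kernel32

        class FILETIME(ctypes.Structure):
            # 100-nanosecond intervals since 1601, split across two DWORDs
            _fields_ = [("dwLowDateTime", wintypes.DWORD),
                        ("dwHighDateTime", wintypes.DWORD)]

        def current_thread_cpu_times():
            """Return (kernel_seconds, user_seconds) for the calling thread."""
            creation, exited, kernel, user = FILETIME(), FILETIME(), FILETIME(), FILETIME()
            pseudo_handle = kernel32.GetCurrentThread()  # pseudo-handle, never needs closing
            ok = kernel32.GetThreadTimes(pseudo_handle,
                                         ctypes.byref(creation), ctypes.byref(exited),
                                         ctypes.byref(kernel), ctypes.byref(user))
            if not ok:
                raise ctypes.WinError()
            to_seconds = lambda ft: ((ft.dwHighDateTime << 32) | ft.dwLowDateTime) * 1e-7
            return to_seconds(kernel), to_seconds(user)

    Each thread would record its own numbers (for example into a shared dict keyed by its thread id) so the main thread can report them.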

    Read the article

  • JSR 275 - Units, Percent per second

    - by I82Much
    Hi all, I need to represent the unit of Percent per second using JScience.org's JSR 275 units and measures implementation. I am trying to do the following: Unit<Dimensionless> PERCENT_PER_SECOND = NonSI.PERCENT.divide(Si.SECOND).asType(Dimensionless.class) but I am getting a ClassCastException when I try to do that. The following works, but I'm not sure if there's a better way: public interface PercentOverTime extends Quantity {} public static Unit<PercentOverTime> PERCENT_PER_SECOND = new BaseUnit<PercentOverTime>("%/s"); Any thoughts? The closest I could find to this is the question on Cooking Measurements (which is how I saw how to define your own units).

    Read the article

  • How to isolate data per customer, Django powered website

    - by Sawwy
    I have recently started learning Python and Django and I am working on a project that includes building a website for collecting information from customers. I am currently trying to figure out the best way to isolate the customer data (the collected information is sensitive and should only be accessible by the customer and the service provider). I found the post "Postgresql - one database for everyone, or one-database per customer", and my question is: can I automate the model inheritance with customer creation via the admin? To be specific, when save() is called for adding a customer via the Django admin, it should create the customer-specific tables (a new set of tables with a 'company_name' prefix). For more information on the environment: I have extended the basic user registration with a custom UserProfile adding 'company' and 'role' fields for each user. Upon login, the user's 'company' will be checked to filter out tables without that 'company_name' prefix. 'Role' will further filter which company-specific tables can be used and set rights (view, edit). I will appreciate any suggestions if more elegant methods than model inheritance could be used to solve the data isolation problem.
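
    As a point of comparison (my sketch, not from the question): instead of creating prefixed tables per company at save() time, a common shared-schema pattern keeps one set of tables and adds a company foreign key that every query filters on. All model and field names below are illustrative:

        from django.db import models
        from django.contrib.auth.models import User

        class Company(models.Model):
            name = models.CharField(max_length=100, unique=True)

        class UserProfile(models.Model):
            user = models.OneToOneField(User)      # pre-1.9 style, on_delete not required yet
            company = models.ForeignKey(Company)
            role = models.CharField(max_length=30)

        class CustomerRecord(models.Model):
            company = models.ForeignKey(Company)   # the isolation key on every sensitive row
            data = models.TextField()

            @classmethod
            def visible_to(cls, user):
                # Row-level isolation: only this user's company ever comes back.
                return cls.objects.filter(company=user.userprofile.company)

    Views would always go through visible_to(request.user) (or a custom manager doing the same filter), and 'role' would decide between read-only and edit forms. True per-company tables remain possible, but they cannot be expressed as ordinary Django model inheritance and usually mean separate databases or raw DDL in save().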

    Read the article

  • mysql: Receive data only per month

    - by Tristan
    Hello, a while ago I asked how to display data per month; I must have given a bad explanation, because I just figured out that it's not what I want. Here's what I've got: $req1 = ... AND v.date > (DATE_SUB(CURDATE(), INTERVAL 2 MONTH)) AND v.date < (DATE_SUB(CURDATE(), INTERVAL 1 MONTH)) and $req2 = ... AND v.date > (DATE_SUB(CURDATE(), INTERVAL 3 MONTH)) AND v.date < (DATE_SUB(CURDATE(), INTERVAL 2 MONTH)). The problem: imagine that today is the 10th of June; this is going to calculate ALL the data from the 10th of May to the 10th of June, then from the 10th of April to the 10th of May... But what I want is data per calendar month: from the 1st of May to the 1st of June, from the 1st of June to the 1st of July... Do you see what I mean? Thank you ;)
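
    A sketch of one way to express calendar-month boundaries in MySQL (reusing the question's v.date column; the '%Y-%m-01' trick simply builds the first day of a month as a date string):

        -- Previous calendar month, e.g. 1 May 00:00 up to (but excluding) 1 June 00:00
        AND v.date >= DATE_FORMAT(CURDATE() - INTERVAL 1 MONTH, '%Y-%m-01')
        AND v.date <  DATE_FORMAT(CURDATE(), '%Y-%m-01')

        -- The month before that: 1 April up to 1 May
        -- AND v.date >= DATE_FORMAT(CURDATE() - INTERVAL 2 MONTH, '%Y-%m-01')
        -- AND v.date <  DATE_FORMAT(CURDATE() - INTERVAL 1 MONTH, '%Y-%m-01')

    Each pair of conditions covers exactly one whole month, regardless of today's day of the month.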

    Read the article

  • Limit number of views per day in Django

    - by ariddell
    Is there an easy way to limit the number of times a view can be accessed by a given IP address per day/week? A simplified version of the technique used by some booksellers to limit the number of pages of a book you can preview? There's only one view that this limit needs to apply to--i.e. it's not a general limit--and it would be nice if I could just have a variable 'overlimit' in the template context. The solution need not be terribly robust, but limiting by IP address seemed like a better idea than using a cookie. I've looked into the session middleware but it doesn't make any references to tracking IP addresses as far as I can tell. Has anyone encountered this problem?
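
    A minimal sketch (my own, with illustrative names such as DAILY_LIMIT and preview.html) using Django's low-level cache API to count hits per IP and hand an overlimit flag to the template:

        from django.core.cache import cache
        from django.shortcuts import render_to_response

        DAILY_LIMIT = 20  # illustrative value

        def preview(request):
            ip = request.META.get('REMOTE_ADDR', 'unknown')
            key = 'preview-hits-%s' % ip
            hits = cache.get(key, 0) + 1
            cache.set(key, hits, 24 * 60 * 60)  # the counter quietly expires after a day
            return render_to_response('preview.html', {'overlimit': hits > DAILY_LIMIT})

    The window slides rather than resetting at midnight, and REMOTE_ADDR is only as trustworthy as the proxy setup in front of Django, but it matches the "need not be terribly robust" requirement without adding any tables.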

    Read the article

  • Table-per-type inheritance insert problem

    - by gzak
    I followed this article on making a table-per-type inheritance model for my entities, but I get the following error when I try to add an instance of a subclass to the database. Here is how I create the subtype: var cust = db.Users.CreateObject<Customer>(); // Customer inherits User db.Users.AddObject(cust); db.SaveChanges(); I get the following error when I make that last call: "A value shared across entities or associations is generated in more than one location. Check that mapping does not split an EntityKey to multiple store-generated columns." With the following inner exception: "An item with the same key has already been added." Any ideas on what I could be missing?

    Read the article

  • Django: remove list_editable on a per-row basis

    - by Jason Leveille
    On the back of Django 1.2RC1, I have built a great administrator for my client! It's awesome. The client has one request that, if not satisfied, could bring this house of cards crashing down. I have a list of swim meet results. A meet result is added by a superuser. A team rep can edit a meet result (which they must be able to do inline, with list_editable), but after they edit the meet result inline and save, they should no longer be able to edit that result inline. They can only perform one edit. So, my question ... can I turn off list_editable on a per row basis?

    Read the article

  • Wireshark Plugin: Dissecting Payloads With Multiple Packets Per UDP Frame

    - by John Dibling
    I am writing a Wireshark plugin to dissect a protocol that places multiple application-level packets in a single UDP frame. There is no enclosing protocol that indicates how many packets there are in the frame. So essentially, the payload coming down the wire will look like this: uint64 sequence1, uint64 data1, uint8 flags1; uint64 sequence2, uint64 data2, uint8 flags2; ...; uint64 sequence_n, uint64 data_n, uint8 flags_n. In my server code that actually processes this information, I simply loop through the frame until I reach the end. In looking through the plugins included with the Wireshark source code, I didn't see any protocols that did any looping like this. I know other protocols pack multiple payloads per frame. What is the canonical or standard way to handle protocols like this in a Wireshark dissector?

    Read the article

  • Apache setting mod_auth_ldap require settings per sub-directory

    - by Anthony
    I would like to set up a primary directory that has one set of LDAP-based restrictions and then have various sub-directories use other restrictions, but only have the actual LDAP search done in the base directory. For example (one .htaccess per directory): /Primary_Directory gets AuthLDAPURL "ldap://ldap1.airius.com:389/ou=People, o=Airius?uid?sub?(objectClass=*)" plus Require group cn=admins; ../Open2All gets Require valid-user; ../No_Admins_Allowed gets Require group cn!=admins. So basically, the primary directory (in this example) can only be accessed by users who are in the admins group, the first sub-directory can be accessed by anyone in the directory, and the second sub-folder can be reached by anyone who is NOT in the admins group. But I only want to set the Require line for the sub-directories, and not set up the LDAP query again on each sub-directory. Is this possible, even though there are clear permission conflicts from level to level? Does the deepest .htaccess file know that the Require line refers to the LDAP search in the parent folder?

    Read the article

  • Multiple arrangements/asserts per unit test?

    - by lance
    A group of us (.NET developers) are talking unit testing. Not any one framework (we've hit on MSpec, NUnit, MSTest, RhinoMocks, TypeMock, etc.) -- we're just talking generally. We see lots of syntax that forces a distinct unit test per scenario, but we don't see an avenue for re-using one unit test with various inputs or scenarios. Also, we don't see an avenue for multiple asserts in a given test without an early assert's failure threatening the testing of later asserts (in the same test). Is there anything like that happening in .NET unit testing (state- or behavior-based) today?

    Read the article

  • Website access per client and each client having multiple users Sample Application

    - by windson
    I'm interested in building a web application in .NET that scales to multiple clients, where each client has users associated with it. Suppose that my website is xyz.com and I have 3 clients, "abc", "klm" and "pqr", and I want to give access to the features of xyz.com under links as follows: www.xyz.com/abc, www.xyz.com/klm, www.xyz.com/pqr. Client abc has N users, and I want to define 3 roles for every client's users. Is there any sample application in .NET that supports this kind of website access per client with multiple users? And if I use ASP.NET Membership, will that be a suitable membership solution, or do I need to opt for another type of membership, either defined by myself or already available in the open-source market for .NET? Edit: All the clients will have the same functionality. I would like to build a generic model for www.xyz.com/{whatever} so that in the future, if a new client wants to register with me, they just have to give a client name, and upon adding that client name all the features available to existing clients will apply.

    Read the article

  • RSS feed per tag

    - by niaher
    Hi. Suppose stackoverflow.com wanted to have an RSS feed per tag. They would probably have requests like stackoverflow.com/rss?tag=aspnet return the appropriate RSS feed. This is the easy part. Now, when the user requests stackoverflow.com/rss?tag=aspnet, he would just see some XML. Instead, it would be better to show a page where the user can choose which RSS reader he wants to subscribe with (just like feedburner.com). My question is: is there any ready-made code (html+javascript) that I can copy-paste to create such a subscription page? Basically I want to copy feedburner.com's subscription page onto my own site. PS - I would be happy using feedburner.com, but it would require me to create a feed for each tag manually, which is impractical.

    Read the article

  • Cache bandwidth per tick for modern CPUs

    - by osgx
    Hello. What is the speed of cache access for modern CPUs? How many bytes can be read or written from memory every processor clock tick by an Intel P4, Core2, Core i7, or AMD? Please answer with both theoretical numbers (width of the ld/st unit with its throughput in uops/tick) and practical numbers (even memcpy speed tests, or the STREAM benchmark), if any. PS: this question relates to the maximal rate of load/store instructions in assembler. There is a theoretical rate of loading (where all instructions per tick are the widest loads), but the processor can deliver only part of that, which is the practical limit on loading.

    Read the article

  • Table per subclass inheritance relationship: How to query against the Parent class without loading a

    - by Arthur Ronald F D Garcia
    Suppose a table-per-subclass inheritance relationship which can be described as below (from wikibooks.org - see here). Notice the parent class is not abstract: @Entity @Inheritance(strategy=InheritanceType.JOINED) public class Project { @Id private long id; // Other properties } @Entity @Table(name="LARGEPROJECT") public class LargeProject extends Project { private BigDecimal budget; } @Entity @Table(name="SMALLPROJECT") public class SmallProject extends Project { } I have a scenario where I just need to retrieve the parent class. Because of performance issues, what should I do to run an HQL query in order to retrieve the parent class, and just the parent class, without loading any subclass?

    Read the article

  • Java - Confused by the one class per file rule

    - by Mark
    The one class per file rule in Java has me a bit confused. I am writing an Android app and trying to implement the accepted answer to this question: Common class for AsyncTask in Android? which calls for an interface definition which class A implements and class B accepts as an argument to its constructor. So I need an A.java and a B.java, but where does the interface go? Does it need a separate .java file of its own? Do I have to define it inside both A and B? If not, how do I import it? Also, I will have about 10 different AsyncTask classes, but I don't want to bother creating a new file for each one. What would you recommend? Is there a way to put all 10 classes in one file? Or should I create a big if/then block inside the class and pass an argument telling it which of the 10 different tasks I want it to do?

    Read the article

  • Getting the most recent entry per group in a select statement

    - by TheObserver
    I have 3 tables to join to get table1.code, table1.series, table2.entry_date and table3.Title1, and I'm trying to get the most recent non-null table3.Title1 grouped by table1.code and table1.series. The query select table1.code, table1.series, max(table2.entry_date), table3.Title1 from table3 INNER JOIN table2 ON table3.ID = table2.ID INNER JOIN table1 ON table2.source_code = table1.code where table3.Title1 is not NULL group by table1.code, table1.series, table3.Title1 seems to give me all entries with a non-null Title1 instead of just the most recent one. How should I structure the query to pick only the newest version of Title1 per code & series?
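
    One common shape for this kind of greatest-per-group query is sketched below (table and column names are the ones from the question; untested against the real schema): first compute the latest entry_date per (code, series) in a derived table, then join back to pick up the Title1 that belongs to that latest row.

        SELECT t1.code, t1.series, t2.entry_date, t3.Title1
        FROM table3 t3
        INNER JOIN table2 t2 ON t3.ID = t2.ID
        INNER JOIN table1 t1 ON t2.source_code = t1.code
        INNER JOIN (
            -- latest date per (code, series), considering only rows with a title
            SELECT t1.code, t1.series, MAX(t2.entry_date) AS latest_date
            FROM table3 t3
            INNER JOIN table2 t2 ON t3.ID = t2.ID
            INNER JOIN table1 t1 ON t2.source_code = t1.code
            WHERE t3.Title1 IS NOT NULL
            GROUP BY t1.code, t1.series
        ) latest ON latest.code = t1.code
                AND latest.series = t1.series
                AND latest.latest_date = t2.entry_date
        WHERE t3.Title1 IS NOT NULL;

    Dropping Title1 from the GROUP BY is the key point: grouping by it is what produces one row per distinct title rather than one row per code/series.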

    Read the article

  • Count number of messages per user

    - by Pr0no
    Consider two tables: users (user_id, messages), currently holding users 1, 2 and 3 with messages = 0, and messages (msg_id, user_id, content), holding five rows: (1, 1, foo), (2, 1, bar), (3, 1, foobar), (4, 3, baz), (5, 3, bar). I want to count the number of messages per user and write the outcome into users.messages, so the result becomes: user 1 -> 3, user 2 -> 0, user 3 -> 2. I could use PHP to perform this operation, in pseudo-code: foreach ($user_id in users) { $count = select count(msg_id) from messages where user_id = $user_id; update users set messages = $count } But this is probably very inefficient compared to one query executed in MySQL directly: UPDATE users SET messages = ( SELECT COUNT(msg_id) FROM messages ) However, I'm sure this is not a proper query, since the subquery is not tied to each user. Therefore, any help would be appreciated :-)
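
    A sketch of the single-statement version (my correction of the attempted query above, using the question's table and column names): the subquery just needs to be correlated with the outer row, so each user gets his own count and users without messages naturally get 0.

        -- Correlated subquery: counts only the messages belonging to the row being updated
        UPDATE users
        SET messages = (
            SELECT COUNT(m.msg_id)
            FROM messages m
            WHERE m.user_id = users.user_id
        );

    MySQL allows this because the table being read (messages) is different from the table being updated (users); COUNT over an empty set returns 0, which covers user 2.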

    Read the article

  • The Scala way to use one actor per socket connection

    - by Stefan
    I am wondering how it is possible to avoid one socket connection per thread in Scala. I have thought a lot about it, but I always end up with some code which is listening for incoming data for each client connection. The problem is that I want to develop an application which should simultaneously handle perhaps a couple of thousand connections. However, I of course do not want to create a thread for each connection, because of the lack of scalability and the context switching. What would be the "right" way to do this? In my world it should be possible to have one actor for each connection without the need to block one thread per actor.

    Read the article

  • Why does Java ArrayList use per-element casting instead of per-array casting?

    - by user1111929
    What happens inside Java's ArrayList<T> (and probably many other classes) is that there is an internal Object[] array = new Object[n];, to which T objects are written. Whenever an element is read from it, a cast return (T) array[i]; is done. So, a cast on every single read. I wonder why this is done. To me, it seems like they're just doing unnecessary casts. Wouldn't it be more logical, and also slightly faster, to just create a T[] array = (T[]) new Object[n]; and then just return array[i]; without a cast? This is only one cast per array creation, which is usually far less than the number of reads. Why is their method preferred? I fail to see why my idea isn't strictly better.

    Read the article

  • How big can my SharePoint 2010 installation be?

    - by Sahil Malik
    3 years ago, I had published “How big can my SharePoint 2007 installation be?” Well, SharePoint 2010 has significant under-the-covers improvements. So, how big can your SharePoint 2010 installation be? There are three kinds of limits you should know about: hard limits that cannot be exceeded by design; configurable limits that are, well, configurable – but the default values are set for a pretty good reason, so if you need to tweak, plan and understand before you tweak; and soft limits, which you can exceed, but it is not recommended that you do. Before you read any of the limits, read these two important disclaimers: 1. The limit depends on what you’re doing, so don’t take the below as gospel – the reality depends on your situation. 2. There are many additional considerations in planning your SharePoint solution scalability and performance besides just the below. So with those in mind, here goes.

    Hard Limits -
    - Zones per web app: 5
    - RBS NAS performance: time to first byte of any response from the NAS must be less than 20 milliseconds
    - List row size: 8000 bytes, driven by how SharePoint stores list items internally
    - Max file size: 2GB (default is 50MB, configurable). RBS does not increase this limit.
    - Search metadata properties: 10,000 per item crawled (pretty damn high, you’ll never need to worry about it)
    - Max # of concurrent in-memory enterprise content types: 5000 per web server, per tenant
    - Max # of external system connections: 500 per web server
    - PerformancePoint services using Excel Services as a data source: no single query can fetch more than 1 million Excel cells
    - Office Web Apps renders: one doc per second, per CPU core, per application server, limited to a maximum of 8 cores

    Configurable Limits -
    - Row size limit: 6, configurable via the SPWebApplication.MaxListItemRowStorage property
    - List view lookup: 8 join operations per query
    - Max number of list items that a single operation can process at one time in normal hours: 5000, configurable via SPWebApplication.MaxItemsPerThrottledOperation. Also you get a warning at 3000, which is configurable via SPWebApplication.MaxItemsPerThrottledOperationWarningLevel. In addition, throttle overrides can be requested, throttle overrides can be disabled, and time windows can be set when the throttle is disabled.
    - Max number of list items for administrators that a single operation can process at one time in normal hours: 20000, configurable via SPWebApplication.MaxItemsPerThrottledOperationOverride
    - Enumerating subsites: 2000
    - Word and PowerPoint co-authoring simultaneous editors: 10 (hard limit is 99)
    - # of web parts on a page: 25
    - Search crawl DBs per search service app: 10
    - Items per crawl DB: 25 million
    - Search keywords: 200 per site collection. There is a max limit of 5000, which can then be modified by editing the web.config/client.config.
    - Concurrent # of workflows on a content DB: 15. Workflows running in the timer service are not counted in this limit; further workflows are queued. Can be configured via the Set-SPFarmConfig PowerShell commandlet.
    - Number of events picked up by the workflow timer job and delivered to workflows: 100. You can increase this limit by running additional instances of the workflow timer service.
    - Visio Services file size: 50MB
    - Visio web drawing recalculation timeout: 120 seconds, configurable via the Set-SPVisioPerformance PowerShell commandlet
    - Visio Services minimum and maximum cache age for data-connected diagrams: 0 to 24 hours, default is 60 minutes, configurable via the Set-SPVisioPerformance PowerShell commandlet

    Soft Limits -
    - Content databases: 300 per web app
    - Application pools: 10 per web server
    - Managed paths: 20 per web app
    - Content database size: 200GB per content DB
    - Size of 1 site collection: 100GB
    - # of sites in a site collection: 250,000
    - Documents in a library: 30 million, with nesting. Depends heavily on type, usage and size of documents.
    - Items: 30 million. Depends heavily on usage of items.
    - SPGroups one SPUser can be in: 5000
    - Users in a site collection: 2 million, depends on UI, nesting, containers and underlying user store
    - AD principals in an SPGroup: 5000
    - SPGroups in a site collection: 10000
    - Search service instances: 20
    - Indexed items in search: 100 million
    - Crawl log entries: 100 million
    - Search alerts: 1 million per search application
    - Search crawled properties: 1/2 million
    - URL removals in search: 100 removals per operation
    - User profiles: 2 million per service application
    - Social tags: 500 million per social database

    Read the article

  • Wix create non advertised shortcut for all users / per machine

    - by mcdon
    In WiX, how do you create a non-advertised shortcut in the all-users profile? So far I've only been able to accomplish this with advertised shortcuts. I prefer non-advertised shortcuts because you can go to the shortcut's properties and use "find target". The tutorials I've seen use a registry value for the KeyPath of the shortcut. The problem is that they use HKCU as the root. When HKCU is used and another user uninstalls the program (since it's installed for all users), the registry key is left behind. When I use HKMU as the root I get an ICE57 error, but the key is removed when another user uninstalls the program. I seem to be pushed towards using HKCU, though HKMU seems to behave correctly (per-user vs. all-users). When I try to create the non-advertised shortcut I get various ICE errors such as ICE38, ICE43, or ICE57. Most articles I've seen recommend "just ignore the ICE errors". There must be a way to create non-advertised shortcuts without creating ICE errors. Please post sample code for a working example.

    Read the article

  • LaTeX: bibliography per chapter.

    - by YuppieNetworking
    Hello all, I am helping a colleague with his PhD thesis and we need to present the bibliography at the end of each chapter. The question is: does anyone have a minimal working example for this case using LaTeX+BibTeX? The current document structure that we use is the following: main.tex, chap1.tex, chap2.tex, ..., chapn.tex, biblio.bib, where main.tex contains the packages, document declarations, macros and \includes for each chapter. biblio.bib is the only BibTeX file (I think it is easier to have all citations in one place). We have searched and tried different LaTeX packages, reading and following their documentation - specifically, bibitems and chapterbib. bibitems successfully generates the bu*.aux files, but when running bibtex on each one of them, an error occurs since there is no \bibdata element in the .aux file. chapterbib also generates a .aux file, but bibtex finishes with an error caused by using multiple \bibliography{file} commands in the .tex files (one per chapter). Some coworkers suggested using a separate BibTeX file for each chapter, which could become a maintenance problem in the future when citing the same publications in different chapters. We would like to keep this document structure, if possible. So, if anyone could shed some light on this problem, we would appreciate it. Thanks.

    Read the article
