Search Results

Search found 754 results on 31 pages for 'aggregate'.

Page 15/31

  • Are there statistics or time series of open bugs in Ubuntu?

    - by aroque
    I would like to know how the number of bugs in Ubuntu (open, closed, critical, etc.) has evolved over time. It's a sort of scientific curiosity I have, but it would also give me a feeling for how the community has changed over time, how it has coped with the challenges (I think of Unity in particular) and what its status is now. Has anyone collected these data over the years? If so, are they publicly available? I know this information can be gathered from Launchpad itself, and I actually found a website that had data from mid-2008 to early 2009. I found Ubuntu live stats, which shows live messages related to Ubuntu but does not aggregate bug statistics. Finally, there are some stats in the Ubuntu Weekly Newsletter, but they only show diffs of bugs closed during the last week.

    Read the article

  • Oracle Unveils Oracle Java Embedded Suite 7.0

    - by user12612705
    Today Oracle announced the Oracle Java Embedded Suite 7.0. What is the Java Embedded Suite (JES)? It's a middleware stack designed to run on embedded devices. It's a suite which includes an application server (GlassFish for Embedded Suite), a database (Java DB), and web services (Jersey Web Services Framework). Putting these services on the embedded device gives you the ability to provide a set of services at the device itself. It also lets you aggregate data at the device, which you can later sync with your enterprise systems.

    Read the article

  • Data architecture for event log metrics?

    - by elliot42
    My service has a large ongoing number of user events, and we would like to do things like "count occurrences of event type T since date D." We are trying to make two basic decisions:
    1. What to store? Storing every event vs. only storing aggregates: (event-log style) log every event and count them later, vs. (time-series style) store a single aggregated "count of event E for date D" for every day.
    2. Where to store the data? In a relational database (particularly MySQL), in a non-relational (NoSQL) database, or in flat log files (collected centrally over the network via syslog-ng).
    What is standard practice / where can I read more about comparing the different types of systems? Additional details: the total event stream is large, potentially hundreds of thousands of entries per day, but our current need is only to count certain types of events within it, and we don't necessarily need real-time access to the raw data or the aggregation results. IMHO, "log all events to files, crawl them at a later time to filter and aggregate the stream" is a pretty standard UNIX way, but my Rails-y compatriots seem to think that nothing is real unless it's in MySQL.
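
    For what it's worth, a minimal sketch of the "log everything, aggregate later" option (the language is incidental, and the one-event-per-line log format with an ISO timestamp followed by an event type is an assumption, not from the question):

        using System;
        using System.IO;
        using System.Linq;

        class DailyEventCounts
        {
            static void Main()
            {
                // Each line is assumed to look like "2012-05-01T13:45:12Z purchase ...".
                var counts = File.ReadLines("events.log")
                    .Select(line => line.Split(' '))
                    .Where(parts => parts.Length >= 2)
                    .GroupBy(parts => new
                    {
                        Date = DateTime.Parse(parts[0]).Date, // day the event occurred
                        Type = parts[1]                       // event type, e.g. "purchase"
                    })
                    .Select(g => new { g.Key.Date, g.Key.Type, Count = g.Count() });

                // These (date, type, count) rows are exactly the time-series-style aggregates
                // that could be written to MySQL once a day.
                foreach (var row in counts)
                    Console.WriteLine("{0:yyyy-MM-dd} {1} {2}", row.Date, row.Type, row.Count);
            }
        }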

    Read the article

  • How to execute scalar function using Enterprise Library?

    - by Vadim
    I'm having trouble executing a scalar function using Enterprise Library 5.0. The code looks something like this: somedDb.ExecuteScalar(CommandType.Text, "SELECT dbo.MyFunction('param')"); When the code is executed, I get the following error: Cannot find either column "dbo" or the user-defined function or aggregate "dbo.MyFunction", or the name is ambiguous.
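
    That particular error comes from SQL Server rather than Enterprise Library: it usually means dbo.MyFunction cannot be resolved in the database the connection points to (it doesn't exist there, the connection targets a different database, or it is actually a table-valued function being invoked with scalar syntax). Assuming the function does exist in that database, a parameterized version of the same call looks roughly like this (a sketch; the "MyDb" connection-string name is hypothetical):

        using System.Data;
        using Microsoft.Practices.EnterpriseLibrary.Data;

        class ScalarFunctionExample
        {
            static object GetValue()
            {
                // Resolve the Database from a named connection string in the config file.
                Database db = DatabaseFactory.CreateDatabase("MyDb");

                // Parameterize the call instead of embedding the literal in the SQL text.
                using (var cmd = db.GetSqlStringCommand("SELECT dbo.MyFunction(@param)"))
                {
                    db.AddInParameter(cmd, "@param", DbType.String, "param");
                    return db.ExecuteScalar(cmd);
                }
            }
        }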

    Read the article

  • adding footer value to datagridview in C#

    - by Jankhana
    I have 2 DataTables. One holds the actual values of the table selected by the user, and the other holds the aggregate values for that table, i.e. any grand total or average. I want to display this as a footer in a C# DataGridView. How can I do that? In ASP.NET we have the RowDataBound event; is there something similar in Windows Forms C#? I'm not able to find what it is.
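
    The Windows Forms DataGridView has no built-in footer row, so one common workaround is to dock a second, header-less grid underneath the main one and bind the one-row aggregate table to it. A rough sketch (detailTable and footerTable stand in for the two DataTables described above):

        using System.Data;
        using System.Windows.Forms;

        static class FooterGridExample
        {
            public static void BindWithFooter(Form form, DataTable detailTable, DataTable footerTable)
            {
                var footerGrid = new DataGridView
                {
                    Dock = DockStyle.Bottom,
                    Height = 28,
                    ColumnHeadersVisible = false,   // makes it look like a footer row
                    AllowUserToAddRows = false,
                    ReadOnly = true,
                    DataSource = footerTable        // the single-row aggregate table
                };

                var detailGrid = new DataGridView
                {
                    Dock = DockStyle.Fill,
                    ReadOnly = true,
                    DataSource = detailTable        // the rows the user selected
                };

                form.Controls.Add(footerGrid);
                form.Controls.Add(detailGrid);
                detailGrid.BringToFront();          // let the fill-docked grid occupy the space above the footer
            }
        }

    If the aggregate table hasn't been built yet, DataTable.Compute (e.g. detailTable.Compute("SUM(Total)", "")) is one way to produce the grand totals to put in it.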

    Read the article

  • Best tool for DOM manipulation?

    - by Olivier Lalonde
    I'm working on a web scraper which will aggregate data from various websites. I started with PHP's built-in DOM functions, but after running into a couple of issues (especially regarding malformed markup and character encoding), I've chosen to ditch PHP. I was thinking of server-side JavaScript but am open to other suggestions. If I go with JavaScript, which interpreter should I use?

    Read the article

  • Modeling distribution of performance measurements

    - by peterchen
    How would you mathematically model the distribution of repeated real-life performance measurements? "Real life" meaning you are not just looping over the code in question, but it is just a short snippet within a large application running in a typical user scenario. My experience shows that you usually have a peak around the average execution time that can be modeled adequately with a Gaussian distribution. In addition, there's a "long tail" containing outliers, often with a multiple of the average time. (The behavior is understandable considering the factors contributing to first-execution penalty.)
    My goal is to model aggregate values that reasonably reflect this, and that can be calculated from aggregate values (like for the Gaussian, calculate mu and sigma from N, sum of values and sum of squares). In other terms, the number of repetitions is unlimited, but memory and calculation requirements should be minimized. A normal Gaussian distribution can't model the long tail appropriately and will have the average biased strongly even by a very small percentage of outliers. I am looking for ideas, especially if this has been attempted/analysed before. I've checked various distribution models, and I think I could work out something, but my statistics is rusty and I might end up with an overblown solution. Oh, a complete shrink-wrapped solution would be fine, too ;)
    Other aspects / ideas: Sometimes you get "two humps" distributions, which would be acceptable in my scenario with a single mu/sigma covering both, but ideally would be identified separately. Extrapolating this, another approach would be a "floating probability density calculation" that uses only a limited buffer and adjusts automatically to the range (due to the long tail, bins may not be spaced evenly) - I haven't found anything, but with some assumptions about the distribution it should be possible in principle.
    Why (since it was asked): For a complex process we need to make guarantees such as "only 0.1% of runs exceed a limit of 3 seconds, and the average processing time is 2.8 seconds". The performance of an isolated piece of code can be very different from a normal run-time environment involving varying levels of disk and network access, background services, scheduled events that occur within a day, etc. This can be solved trivially by accumulating all data. However, to accumulate this data in production, the data produced needs to be limited. For analysis of isolated pieces of code, a Gaussian deviation plus first-run penalty is OK. That doesn't work anymore for the distributions found above.
    [edit] I've already got very good answers (and finally - maybe - some time to work on this). I'm starting a bounty to look for more input / ideas.
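
    As a concrete illustration of the "aggregates computable from aggregates" constraint mentioned above, here is a minimal sketch of a fixed-size accumulator that keeps only N, the sum of values and the sum of squares, plus a count of runs over a chosen threshold (the class and its names are mine, not from the question; it gives mu/sigma and a tail fraction, not a proper long-tail model):

        using System;

        class TimingAccumulator
        {
            private long n;
            private double sum;
            private double sumSquares;
            private long overThreshold;
            private readonly double thresholdSeconds;

            public TimingAccumulator(double thresholdSeconds)
            {
                this.thresholdSeconds = thresholdSeconds;
            }

            public void Add(double seconds)
            {
                n++;
                sum += seconds;
                sumSquares += seconds * seconds;
                if (seconds > thresholdSeconds) overThreshold++;
            }

            public double Mean { get { return sum / n; } }

            // Population standard deviation from the running sums. Note: this formula can be
            // numerically unstable for large N; Welford's online algorithm is a safer variant.
            public double StdDev { get { return Math.Sqrt(sumSquares / n - Mean * Mean); } }

            // Fraction of runs exceeding the threshold, e.g. for "only 0.1% exceed 3 seconds".
            public double OverThresholdFraction { get { return (double)overThreshold / n; } }
        }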

    Read the article

  • LINQ To objects: Quicker ideas?

    - by SDReyes
    Do you see a better approach to obtain and concatenate item.Number in a single string? Current:

        var numbers = new StringBuilder();
        // group is the result of a previous group by
        var basenumbers = group.Select( item => item.Number );
        basenumbers.Aggregate( numbers, ( res, element ) => res.AppendFormat( "{0:00}", element ) );
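
    For comparison, a sketch of a shorter equivalent on .NET 4 or later (it reuses group and Number from the question and produces the same "00"-formatted concatenation, just without the explicit StringBuilder):

        string numbers = string.Concat( group.Select( item => item.Number.ToString( "00" ) ) );

    Performance-wise it's much the same - string.Concat builds the result in one pass - so this is mostly a readability change rather than a faster one.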

    Read the article

  • Web Safe Area (optimal resolution) for web app design

    - by M.A.X
    I'm in the process of designing a new web app and I'm wondering what 'web safe area' I should optimize the app layout and design for. I did some investigation and thinking on my own, but wanted to share this to see what the general opinion is. Here is what I found.
    Optimal display resolution: w3schools web stats seems to be the most referenced source (however they state that these are results from their site and are biased towards tech-savvy users). http://www.w3counter.com/globalstats.php has aggregate data from something like 15,000 different sites that use their tracking services. StatCounter Global Stats Display Resolution stats are based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected from across the StatCounter network of more than 3 million websites. NetMarketShare Screen Resolutions (marketshare.hitslink.com) is a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live stats customers, compiled from approximately 160 million visitors per month.
    Display resolution summary: there is a bit of variation between the above sources, but in general, as of Jan 2011, it looks like 1024x768 is about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these with 15-20% of the total web, depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers. My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).
    Browser + OS constraints: to further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space consuming) and the OS (assuming WinXP-Win7). Win7 has the biggest taskbar footprint at a height of 40px (XP's and Vista's is 30px). The default IE8 view uses up 25px at the bottom of the screen with the status bar and a further 120px at the top of the screen with the window title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would instead be 91px without the favorites toolbar). Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline. This leaves 768 - 40 - 25 - 120 = 583px of vertical space and 1280 - 4 = 1276px of horizontal: in other words, a web safe area of 1276 x 583.
    Is this a correct line of thinking? I tried to Google some design best practices but most still talk about designing around 1024x768, which seems to be quickly disappearing. Any help on this would be greatly appreciated! Thanks.

    Read the article

  • Form generator and data capture PHP application

    - by Tom
    Hi, does anyone know of an open source PHP application which can generate forms to be deployed across your website? These forms will collect and aggregate the data within one database. There should also be the functionality to search across the forms (to generate reports and newsletter mailing lists). All the services I have found so far have been hosted solutions. Thanks, Tom

    Read the article

  • Custom aggregation in GROUP BY clause

    - by Rire1979
    If I have a table with a schema like this: table(Category, SubCategory1, SubCategory2, Status). I would like to group by Category and SubCategory1 and aggregate Status such that if not all Status values over the group have a certain value, Status will be 0, otherwise 1. So my result set will look like (Category, SubCategory1, Status). I don't want to write a function; I would like to do it inside the query.

    Read the article

  • Good Domain Driven Design samples

    - by jlembke
    I'm learning about DDD and enjoying every minute of it. However, there are some practical issues that are confusing to me that I think seeing some good samples might clear up. So, being at peace with those issues, does anyone know of some good working code samples that do a good job of modeling basic DDD concepts? I'm particularly interested in: an illustrative domain model, repositories, use of domain/application services, value objects, and aggregate roots. I know I'm probably asking for too much, but anything close will help.
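
    As a purely illustrative toy (not from any of the sample projects being asked about), two of those concepts side by side - a value object and the aggregate root that owns its entities - might look like this:

        using System;
        using System.Collections.Generic;

        // Value object: immutable, no identity of its own.
        public sealed class Money
        {
            public decimal Amount { get; private set; }
            public string Currency { get; private set; }
            public Money(decimal amount, string currency) { Amount = amount; Currency = currency; }
        }

        // Entity that only ever lives inside the Order aggregate.
        public class OrderLine
        {
            public string Sku { get; private set; }
            public int Quantity { get; private set; }
            public Money UnitPrice { get; private set; }
            public OrderLine(string sku, int quantity, Money unitPrice)
            {
                Sku = sku; Quantity = quantity; UnitPrice = unitPrice;
            }
        }

        // Aggregate root: the only way in, so invariants are enforced in one place.
        public class Order
        {
            private readonly List<OrderLine> lines = new List<OrderLine>();
            public Guid Id { get; private set; }
            public IEnumerable<OrderLine> Lines { get { return lines; } }

            public Order() { Id = Guid.NewGuid(); }

            public void AddLine(string sku, int quantity, Money unitPrice)
            {
                if (quantity <= 0) throw new ArgumentOutOfRangeException("quantity");
                lines.Add(new OrderLine(sku, quantity, unitPrice));
            }
        }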

    Read the article

  • Django select max id

    - by pistacchio
    Hi, given a standard model (called Image) with an auto-set 'id', how do I get the max id? So far I've tried max_id = Image.objects.all().aggregate(Max('id')), but I get an 'id__max' KeyError. Trying max_id = Image.objects.order_by('id')[0].id gives an 'argument 2 to map() must support iteration' exception. Any help?

    Read the article

  • LINQ: get sum of two columns in one query

    - by Axarydax
    Hi, let's say that I have a table called Items (ID int, Done int, Total int). I can do it by two queries:

        int total = m.Items.Sum( p => p.Total );
        int done = m.Items.Sum( p => p.Done );

    But I'd like to do it in one query, something like this:

        var x = from p in m.Items select new { Sum(p.Total), Sum(p.Done) };

    Surely there is a way to call aggregate functions from LINQ syntax...?
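
    One pattern that keeps it to a single query against LINQ to SQL or Entity Framework (a sketch reusing the question's m.Items; the group-by-a-constant trick lets both sums be computed in one statement):

        var totals = ( from p in m.Items
                       group p by 1 into g
                       select new
                       {
                           Total = g.Sum( x => x.Total ),
                           Done  = g.Sum( x => x.Done )
                       } ).Single();

        // totals.Total and totals.Done now hold both sums from one round trip.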

    Read the article

  • iPhone - packaging multiple apps in a single app

    - by karim
    Hi, I would like to package multiple apps in a single app, so downloading one app and installing it on an iPhone will install 3-4 apps - something like Java MIDlet suites having multiple MIDlets in a single jar file. Is it possible using multiple targets or bundles, an aggregate target, etc.?

    Read the article

  • Choosing a CMS for an artist's site?

    - by shoosh
    I'm looking for a simple CMS for a site I'm building for my girlfriend. The requirements are very minimal: show images one by one, possibly with a line of text for each; show an aggregate gallery of, say, 4x4 images (possibly with several different such galleries); and a customizable look so I could fit it to her mockup. Any suggestions come to mind? Can WordPress do this?

    Read the article

  • LINQ Equivalent for Standard Deviation

    - by Steven
    Does LINQ model the aggregate SQL function STDDEV() (standard deviation)? If not, what is the simplest / best-practices way to calculate it? Example:

        SELECT test_id, AVG(result) avg, STDDEV(result) std
        FROM tests
        GROUP BY test_id
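
    LINQ ships no standard-deviation operator, so for LINQ to Objects the usual route is a small extension method. A minimal sketch (population standard deviation; the names are mine):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        static class StatsExtensions
        {
            public static double StdDev( this IEnumerable<double> values )
            {
                var list = values.ToList();
                if (list.Count == 0)
                    throw new InvalidOperationException( "Sequence contains no elements" );

                double mean = list.Average();
                double variance = list.Sum( v => (v - mean) * (v - mean) ) / list.Count;
                return Math.Sqrt( variance );
            }
        }

    With it, the grouping mirrors the SQL - group r by r.test_id, then take g.Average(x => x.result) and g.Select(x => (double)x.result).StdDev() - though against a database provider that last call would have to run client-side (e.g. after AsEnumerable()).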

    Read the article

  • Rails: three most recent comments with unique users

    - by Dennis Collective
    What would I put in the named scope :by_unique_users so that I can do Comment.recent.by_unique_users.limit(3) and only get one comment per user?

        class User
          has_many :comments
        end

        class Comment
          belongs_to :user
          named_scope :recent, :order => 'comments.created_at DESC'
          named_scope :limit, lambda { |limit| { :limit => limit } }
          named_scope :by_unique_users
        end

    On SQLite, named_scope :by_unique_user, :group => "user_id" works, but makes it freak out on Postgres, which is deployed on production: PGError: ERROR: column "comments.id" must appear in the GROUP BY clause or be used in an aggregate function

    Read the article

  • Help required to increase the performance of a MySQL query

    - by Joseph
    Hi all, I am using the following query in MySQL for fetching data from a table. It's taking too long because of the conditional check within the aggregate function. Please help me make it faster:

        SELECT testcharfield,
               SUM(IF(Type = 'pi', quantity, 0)) AS OB,
               SUM(IF(Type = 'pe', quantity, 0)) AS CB
        FROM Table1
        WHERE sequenceID = 6107
        GROUP BY testcharfield

    Read the article

  • Is a derived table executed once or three times?

    - by AspOnMyNet
    "Every time you make use of a derived table, that query is going to be executed. When using a CTE, that result set is pulled back once and only once within a single query." Does the quote suggest that the following query will cause the derived table to be executed three times (once for each aggregate function call)?

        SELECT AVG(OrdersPlaced), MAX(OrdersPlaced), MIN(OrdersPlaced)
        FROM
        (
            SELECT v.VendorID, v.[Name] AS VendorName, COUNT(*) AS OrdersPlaced
            FROM Purchasing.PurchaseOrderHeader AS poh
            INNER JOIN Purchasing.Vendor AS v ON poh.VendorID = v.VendorID
            GROUP BY v.VendorID, v.[Name]
        ) AS x

    Thanks.

    Read the article

  • SQL UPDATE to the SUM of its joined values

    - by CL4NCY
    Hi, I'm trying to update a field in the database to the sum of its joined values:

        UPDATE P
        SET extrasPrice = SUM(E.price)
        FROM dbo.BookingPitchExtras AS E
        INNER JOIN dbo.BookingPitches AS P
            ON E.pitchID = P.ID AND P.bookingID = 1
        WHERE E.[required] = 1

    When I run this I get the following error: "An aggregate may not appear in the set list of an UPDATE statement." Any ideas?

    Read the article
