Search Results

Search found 9706 results on 389 pages for 'aggregate functions'.

Page 103/389

  • How to execute scalar function using Enterprise Library?

    - by Vadim
    I'm having trouble executing a scalar function using Enterprise Library 5.0. The code looks something like this: someDb.ExecuteScalar(CommandType.Text, "SELECT dbo.MyFunction('param')"); When the code is executed, I get the following error: Cannot find either column "dbo" or the user-defined function or aggregate "dbo.MyFunction", or the name is ambiguous.
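
    A minimal sketch of how this call is often written with the Data Access Application Block, assuming dbo.MyFunction really is a scalar (not table-valued) function and lives in the database that the configured connection string points to; a table-valued function would need SELECT * FROM dbo.MyFunction(...) instead:

    ```csharp
    using System.Data;
    using System.Data.Common;
    using Microsoft.Practices.EnterpriseLibrary.Data;

    // Sketch only: names and the connection-string assumption are illustrative.
    Database someDb = DatabaseFactory.CreateDatabase();
    using (DbCommand cmd = someDb.GetSqlStringCommand("SELECT dbo.MyFunction(@param)"))
    {
        someDb.AddInParameter(cmd, "@param", DbType.String, "param");
        object result = someDb.ExecuteScalar(cmd);
    }
    ```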

    Read the article

  • adding footer value to datagridview in C#

    - by Jankhana
    I have two DataTables: one holds the actual values the user selected, and the other holds the aggregate values for that table, i.e. a grand total or average. I want to display the aggregates as a footer in a C# DataGridView. How can I do that? In ASP.NET we have the RowDataBound event; is there something similar in WinForms? I haven't been able to find it.
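
    The WinForms DataGridView has no built-in footer row, so one common workaround is to append the aggregate values as an extra, read-only row after binding and style it differently. A minimal sketch, assuming a bound DataTable named detailTable with a numeric Amount column and AllowUserToAddRows set to false (all of these names and settings are hypothetical):

    ```csharp
    using System.Data;
    using System.Drawing;
    using System.Windows.Forms;

    // Append a summary row to the bound table (table/column names are placeholders).
    DataRow footer = detailTable.NewRow();
    footer["Amount"] = detailTable.Compute("SUM(Amount)", string.Empty);
    detailTable.Rows.Add(footer);

    // Style the last grid row so it reads as a footer.
    DataGridViewRow last = dataGridView1.Rows[dataGridView1.Rows.Count - 1];
    last.DefaultCellStyle.Font = new Font(dataGridView1.Font, FontStyle.Bold);
    last.ReadOnly = true;
    ```

    Alternatively, the second DataTable that already holds the aggregates can be bound to a single-row DataGridView docked directly below the main grid, keeping the detail data and the footer separate.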

    Read the article

  • Best tool for DOM manipulation?

    - by Olivier Lalonde
    I'm working on a web scraper which will aggregate data from various websites. I've started using PHP's built in DOM functions but after running into a couple of issues (especially regarding malformed markup and character encoding), I've chosen to ditch PHP. I was thinking of server side Javascript but am open to other suggestions. If I go with Javascript, which interpreter should I use?

    Read the article

  • Modeling distribution of performance measurements

    - by peterchen
    How would you mathematically model the distribution of repeated real-life performance measurements? "Real life" meaning you are not just looping over the code in question; it is a short snippet within a large application running in a typical user scenario.

    My experience shows that you usually have a peak around the average execution time that can be modeled adequately with a Gaussian distribution. In addition, there's a "long tail" containing outliers, often at a multiple of the average time. (The behavior is understandable considering the factors contributing to the first-execution penalty.)

    My goal is to model aggregate values that reasonably reflect this and can be calculated from aggregate values (like for the Gaussian: calculate mu and sigma from N, the sum of values and the sum of squares). In other terms, the number of repetitions is unlimited, but memory and calculation requirements should be minimized. A normal Gaussian distribution can't model the long tail appropriately, and the average will be strongly biased even by a very small percentage of outliers.

    I am looking for ideas, especially if this has been attempted/analysed before. I've checked various distribution models, and I think I could work out something, but my statistics is rusty and I might end up with an overblown solution. Oh, a complete shrink-wrapped solution would be fine, too ;)

    Other aspects / ideas: Sometimes you get "two humps" distributions, which would be acceptable in my scenario with a single mu/sigma covering both, but ideally would be identified separately. Extrapolating this, another approach would be a "floating probability density calculation" that uses only a limited buffer and adjusts automatically to the range (due to the long tail, bins may not be spaced evenly) - I haven't found anything like it, but with some assumptions about the distribution it should be possible in principle.

    Why (since it was asked): for a complex process we need to make guarantees such as "only 0.1% of runs exceed a limit of 3 seconds, and the average processing time is 2.8 seconds". The performance of an isolated piece of code can be very different from a normal run-time environment involving varying levels of disk and network access, background services, scheduled events that occur within a day, etc. This can be solved trivially by accumulating all data; however, to accumulate this data in production, the data produced needs to be limited. For analysis of isolated pieces of code, a Gaussian deviation plus first-run penalty is OK. That doesn't work anymore for the distributions found above.

    [edit] I've already got very good answers (and finally - maybe - some time to work on this). I'm starting a bounty to look for more input / ideas.
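
    One family of models that matches the "peak plus long tail" shape while still needing only running sums is the log-normal distribution; this is an editorial suggestion, not something stated in the question. Its parameters are accumulated exactly like the Gaussian case, just on logarithms: keep N, the sum of ln x_i and the sum of (ln x_i)^2, then estimate

    $$\hat{\mu} = \frac{1}{N}\sum_{i=1}^{N}\ln x_i, \qquad \hat{\sigma}^2 = \frac{1}{N}\sum_{i=1}^{N}(\ln x_i)^2 - \hat{\mu}^2 .$$

    Tail guarantees of the form "only 0.1% of runs exceed 3 s" then follow from the log-normal CDF evaluated with these two estimates; a genuine "two humps" distribution would still need a mixture (e.g. of two log-normals) fitted separately.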

    Read the article

  • LINQ To objects: Quicker ideas?

    - by SDReyes
    Do you see a better approach to obtain and concatenate item.Number into a single string? Current:

        // group is the result of a previous group by
        var numbers = new StringBuilder();
        var basenumbers = group.Select( item => item.Number );
        basenumbers.Aggregate( numbers, ( res, element ) => res.AppendFormat( "{0:00}", element ) );
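
    A shorter alternative sketch, assuming group is an in-memory sequence whose items expose a numeric Number property (as in the question): do the formatting in the projection and let string.Concat do the joining.

    ```csharp
    using System.Linq;

    // Equivalent one-liner; ToString("00") applies the same zero-padding as "{0:00}".
    // string.Concat(IEnumerable<string>) needs .NET 4; on older versions use
    // string.Join("", ...ToArray()) instead.
    string numbers = string.Concat(group.Select(item => item.Number.ToString("00")));
    ```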

    Read the article

  • Web Safe Area (optimal resolution) for web app design

    - by M.A.X
    I'm in the process of designing a new web app and I'm wondering what 'web safe area' I should optimize the app layout and design for. I did some investigation and thinking on my own, but wanted to share this to see what the general opinion is. Here is what I found.

    Optimal Display Resolution: w3schools web stats seems to be the most referenced source (however they state that these are results from their site and are biased towards tech-savvy users). http://www.w3counter.com/globalstats.php (aggregate data from something like 15,000 different sites that use their tracking services). StatCounter Global Stats Display Resolution (stats are based on aggregate data collected by StatCounter on a sample exceeding 15 billion pageviews per month, collected across the StatCounter network of more than 3 million websites). NetMarketShare Screen Resolutions (marketshare.hitslink.com) (a web analytics consulting firm; they get data from browsers of site visitors to their on-demand network of live stats customers. The data is compiled from approximately 160 million visitors per month).

    Display Resolution Summary: There is a bit of variation between the above sources, but in general, as of Jan 2011, it looks like 1024x768 is about 20%, while ~85% have a higher resolution of at least 1280x768 (1280x800 is the most common of these with 15-20% of the total web, depending on the source; 1280x1024 and 1366x768 follow behind with 9-14% of the share). My guess would be that the higher resolution values will be even more common if we filter on North America, and even higher if we filter on N. American corporate users (unfortunately I couldn't find any free geographically filtered statistics). Another point to note is that the 1024x768 desktop user population is likely lower than the aforementioned 20%, seeing as the iPad (1024x768 native display) is likely propping up those numbers. My recommendation would be to optimize around the 1280x768 constraint (note: 1280x768 is actually a relatively rare resolution, but I think it's a valid constraint range considering that 1366x768 is relatively common and 1280 is the most common horizontal resolution).

    Browser + OS Constraints: To further add to the constraints, we have to subtract the space taken up by the browser (assuming IE, which is the most space-consuming) and the OS (assuming WinXP-Win7). Win7 has the biggest taskbar footprint at a height of 40px (XP's and Vista's is 30px). The default IE8 view uses up 25px at the bottom of the screen with the status bar, and a further 120px at the top of the screen with the window title bar and the browser UI (assuming the default 'favorites' toolbar is present; it would instead be 91px without the favorites toolbar). Assuming no scrollbar, we also lose a total of 4px horizontally for the window outline. This means that we are left with 583px of vertical space and 1276px of horizontal - in other words, a Web Safe Area of 1276 x 583.

    Is this a correct line of thinking? I tried to Google some design best practices, but most still talk about designing around 1024x768, which seems to be quickly disappearing. Any help on this would be greatly appreciated! Thanks.
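
    For reference, the arithmetic behind that safe area, using the pixel costs listed above (Win7 taskbar, IE8 status bar, IE8 top chrome, window outline) against a 1280x768 baseline:

    $$768 - 40 - 25 - 120 = 583 \ \text{px (vertical)}, \qquad 1280 - 4 = 1276 \ \text{px (horizontal)}.$$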

    Read the article

  • Form generator and data capture PHP application

    - by Tom
    Hi, does anyone know of an open-source PHP application which can generate forms to be deployed across your website? These forms would collect and aggregate the data within one database. There should also be the functionality to search across the forms (to generate reports and newsletter mailing lists). All the services I have found so far have been hosted solutions. Thanks, Tom

    Read the article

  • Custom aggregation in GROUP BY clause

    - by Rire1979
    If I have a table with a schema like this: table(Category, SubCategory1, SubCategory2, Status). I would like to group by Category and SubCategory1 and aggregate Status so that if not all Status values in the group have a certain value, the aggregated Status is 0, and otherwise it is 1. So my result set will look like (Category, SubCategory1, Status). I don't want to write a function; I would like to do it inside the query.
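
    In T-SQL this kind of requirement is usually handled with conditional aggregation (for example, MIN over a CASE expression) rather than a user-defined function. The same grouping logic written as a LINQ sketch, with rows and targetValue as purely hypothetical names, looks like this:

    ```csharp
    using System.Linq;

    // Status becomes 1 only when every row in the (Category, SubCategory1) group
    // carries the target value; otherwise it is 0. "rows" and "targetValue" are placeholders.
    var result = rows
        .GroupBy(r => new { r.Category, r.SubCategory1 })
        .Select(g => new
        {
            g.Key.Category,
            g.Key.SubCategory1,
            Status = g.All(r => r.Status == targetValue) ? 1 : 0
        });
    ```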

    Read the article

  • Good Domain Driven Design samples

    - by jlembke
    I'm learning about DDD and enjoying every minute of it. However, there are some practical issues that are confusing to me that I think seeing some good samples might clear up. So being at peace with those issues, does anyone know of some good working code samples that do a good job of modeling basic DDD concepts? Particularly interested in An illustrative Domain Model Repositories Use of Domain/Application Services Value Objects Aggregate Roots I know I'm probably asking for too much, but anything close will help.

    Read the article

  • Django select max id

    - by pistacchio
    Hi, given a standard model (called Image) with an auto-set 'id', how do I get the max id? So far I've tried: max_id = Image.objects.all().aggregate(Max('id')), but I get an 'id__max' KeyError. Trying max_id = Image.objects.order_by('id')[0].id gives an 'argument 2 to map() must support iteration' exception. Any help?

    Read the article

  • linq get sum of two columns in one query

    - by Axarydax
    Hi, let's say that I have a table called Items (ID int, Done int, Total int). I can do it with two queries:

        int total = m.Items.Sum(p => p.Total);
        int done = m.Items.Sum(p => p.Done);

    But I'd like to do it in one query, something like this:

        var x = from p in m.Items select new { Sum(p.Total), Sum(p.Done) };

    Surely there is a way to call aggregate functions from LINQ syntax...?
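
    One common sketch for getting both sums back from a single query is to group on a constant, so the provider emits one aggregate statement (assuming m.Items is the queryable from the question):

    ```csharp
    using System.Linq;

    // Group every row into one bucket, then compute both sums over that bucket.
    var totals = (from p in m.Items
                  group p by 1 into g
                  select new
                  {
                      Total = g.Sum(p => p.Total),
                      Done = g.Sum(p => p.Done)
                  }).Single();

    // totals.Total and totals.Done now hold both aggregates from one round trip.
    ```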

    Read the article

  • iPhone - packaging multiple app in a single app

    - by karim
    Hi, I would like to package multiple apps in a single app, so that downloading one app and installing it on an iPhone will install 3-4 apps. Something like Java MIDlet suites having multiple MIDlets in a single jar file. Is it possible by using multiple targets or bundles, an aggregate target, etc.?

    Read the article

  • Choosing a CMS for an artist's site?

    - by shoosh
    I'm looking for a simple CMS for a site I'm building for my girlfriend. The requirements are very minimal:

    - Show images one by one, possibly with a line of text for each
    - Show an aggregate gallery of say 4x4 images; possibly have several different such galleries
    - Customizable look so I could fit it to her mockup

    Any suggestions come to mind? Can WordPress do this?

    Read the article

  • LINQ Equivalent for Standard Deviation

    - by Steven
    Does LINQ model the aggregate SQL function STDDEV() (standard deviation)? If not, what is the simplest / best-practice way to calculate it? Example:

        SELECT test_id, AVG(result) avg, STDDEV(result) std
        FROM tests
        GROUP BY test_id
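
    LINQ has no built-in standard-deviation operator, so a common approach is a small extension method over LINQ to Objects. A minimal sketch (population standard deviation; the class and method names are chosen here for illustration):

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Linq;

    public static class StatsExtensions
    {
        // Population standard deviation; divide by (list.Count - 1) instead for the sample version.
        public static double StdDev(this IEnumerable<double> values)
        {
            var list = values.ToList();
            double mean = list.Average();
            return Math.Sqrt(list.Sum(v => (v - mean) * (v - mean)) / list.Count);
        }
    }
    ```

    The grouping then mirrors the SQL, e.g. tests.GroupBy(t => t.test_id).Select(g => new { avg = g.Average(t => (double)t.result), std = g.Select(t => (double)t.result).StdDev() }); note that a query provider such as LINQ to SQL cannot translate a custom extension method, so this part runs in memory.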

    Read the article

  • Rails: three most recent comments with unique users

    - by Dennis Collective
    What would I put in the named scope :by_unique_users so that I can do Comment.recent.by_unique_users.limit(3) and only get one comment per user?

        class User
          has_many :comments
        end

        class Comment
          belongs_to :user
          named_scope :recent, :order => 'comments.created_at DESC'
          named_scope :limit, lambda { |limit| {:limit => limit} }
          named_scope :by_unique_users
        end

    On SQLite, named_scope :by_unique_user, :group => "user_id" works, but it makes Postgres (which is what production is deployed on) freak out:

        PGError: ERROR: column "comments.id" must appear in the GROUP BY clause or be used in an aggregate function

    Read the article

  • (PHP) Validation, Security and Speed - Does my app have these?

    - by Devner
    Hi all, I am currently working on building a community website in PHP. It contains forms that a user can fill in, right from registration to a lot of other functionality. I am not an object-oriented guy, so I am using functions most of the time to handle my application. I know I have to learn OOP, but I currently need to develop this website and get it running soon. Anyway, here's a sample of what I let my app do.

    Consider a page (register.php) that has a form where a user has 3 fields to fill up, say: First Name, Last Name and Email. Upon submission of this form, I want to validate the form and show the corresponding errors to the users:

        <form id="form1" name="form1" method="post" action="<?php echo $_SERVER['PHP_SELF']; ?>">
            <label for="name">Name:</label>
            <input type="text" name="name" id="name" /><br />
            <label for="lname">Last Name:</label>
            <input type="text" name="lname" id="lname" /><br />
            <label for="email">Email:</label>
            <input type="text" name="email" id="email" /><br />
            <input type="submit" name="submit" id="submit" value="Submit" />
        </form>

    This form will POST the info to the same page. So here's the code that will process the POSTed info:

        <?php
        require("functions.php");
        if( isset($_POST['submit']) ) {
            $errors = fn_register();
            if( count($errors) ) {
                //Show error messages
            } else {
                //Send welcome mail to the user or do database stuff...
            }
        }
        ?>

    And here is the functions.php page:

        <?php
        //functions.php page:
        function sql_quote( $value ) {
            if( get_magic_quotes_gpc() ) {
                $value = stripslashes( $value );
            } else {
                $value = addslashes( $value );
            }
            if( function_exists( "mysql_real_escape_string" ) ) {
                $value = mysql_real_escape_string( $value );
            }
            return $value;
        }

        function clean($str) {
            $str = strip_tags($str, '<br>,<br />');
            $str = trim($str);
            $str = sql_quote($str);
            return $str;
        }

        foreach ($_POST as &$value) {
            if (!is_array($value)) {
                $value = clean($value);
            } else {
                clean($value);
            }
        }

        foreach ($_GET as &$value) {
            if (!is_array($value)) {
                $value = clean($value);
            } else {
                clean($value);
            }
        }

        function validate_name( $fld, $min, $max, $rule, $label ) {
            if( $rule == 'required' ) {
                if ( trim($fld) == '' ) {
                    $str = "$label: Cannot be left blank.";
                    return $str;
                }
            }
            if ( isset($fld) && trim($fld) != '' ) {
                if ( isset($fld) && $fld != '' && !preg_match("/^[a-zA-Z\ ]+$/", $fld)) {
                    $str = "$label: Invalid characters used! Only Lowercase, Uppercase alphabets and Spaces are allowed";
                } else if ( strlen($fld) < $min or strlen($fld) > $max ) {
                    $curr_char = strlen($fld);
                    $str = "$label: Must be atleast $min character & less than $max char. Entered characters: $curr_char";
                } else {
                    $str = 0;
                }
            } else {
                $str = 0;
            }
            return $str;
        }

        function validate_email( $fld, $min, $max, $rule, $label ) {
            if( $rule == 'required' ) {
                if ( trim($fld) == '' ) {
                    $str = "$label: Cannot be left blank.";
                    return $str;
                }
            }
            if ( isset($fld) && trim($fld) != '' ) {
                if ( !eregi('^[a-zA-Z0-9._-]+@[a-zA-Z0-9._-]+\.([a-zA-Z]{2,4})$', $fld) ) {
                    $str = "$label: Invalid format. Please check.";
                } else if ( strlen($fld) < $min or strlen($fld) > $max ) {
                    $curr_char = strlen($fld);
                    $str = "$label: Must be atleast $min character & less than $max char. Entered characters: $curr_char";
                } else {
                    $str = 0;
                }
            } else {
                $str = 0;
            }
            return $str;
        }

        function val_rules( $str, $val_type, $rule='required' ){
            switch ($val_type) {
                case 'name':
                    $val = validate_name( $str, 3, 20, $rule, 'First Name');
                    break;
                case 'lname':
                    $val = validate_name( $str, 10, 20, $rule, 'Last Name');
                    break;
                case 'email':
                    $val = validate_email( $str, 10, 60, $rule, 'Email');
                    break;
            }
            return $val;
        }

        function fn_register() {
            $errors = array();
            $val_name = val_rules( $_POST['name'], 'name' );
            $val_lname = val_rules( $_POST['lname'], 'lname', 'optional' );
            $val_email = val_rules( $_POST['email'], 'email' );

            if ( $val_name != '0' ) { $errors['name'] = $val_name; }
            if ( $val_lname != '0' ) { $errors['lname'] = $val_lname; }
            if ( $val_email != '0' ) { $errors['email'] = $val_email; }

            return $errors;
        }
        //END of functions.php page
        ?>

    OK, now it might look like there's a lot, but lemme break it down target-wise. I wanted the foreach ($_POST as &$value) and foreach ($_GET as &$value) loops to loop through the received info from the user submission and strip/remove all malicious input. I am calling the clean function on the input first to achieve the objective stated above. This function will process each input, whether individual field values or even arrays, allow only <br> tags and remove everything else. The rest of it is obvious. Once this happens, the new/cleaned values will be processed by the fn_register() function, and based on the values returned after the validation, we get the corresponding errors or NULL values (as applicable).

    So here are my questions:

    1. This pretty much makes me feel secure, as I am forcing the user to correct malicious data and won't process the final data unless the errors are corrected. Am I correct?

    2. Does the method that I follow guarantee speed (as I am using lots of functions and their corresponding calls)? The fields of a form differ, and the minimum number of fields I may have at any given point of time in any form may be 3 and can go up to as high as 100 (or even more, I am not sure as the website is still being developed). Will having hundreds of fields and their validation in the above way reduce the speed of the application (say up to half a million users are accessing the website at the same time)? What can I do to improve the speed and reduce function calls (if possible)?

    3. Can I do something to improve the current ways of validation? I am holding off on the object-oriented approach and on using PHP's FILTERS for later.

    So please, I request you all to suggest ways to improve/tweak the current approach, and tell me whether the script is vulnerable or safe enough to be used in a live production environment. If not, what can I do to be able to use it live? Thank you all in advance.

    Read the article

  • Required help to Increase the performance of the MySQL query

    - by Joseph
    Hi all, I am using the following query in MySQL to fetch data from a table. It's taking too long because of the conditional check within the aggregate function. Please help me make it faster:

        SELECT testcharfield,
               SUM(IF(Type = 'pi', quantity, 0)) AS OB,
               SUM(IF(Type = 'pe', quantity, 0)) AS CB
        FROM Table1
        WHERE sequenceID = 6107
        GROUP BY testcharfield

    Read the article

  • Is derived table executed once or three times?

    - by AspOnMyNet
    "Every time you make use of a derived table, that query is going to be executed. When using a CTE, that result set is pulled back once and only once within a single query." Does the quote suggest that the following query will cause the derived table to be executed three times (once for each aggregate function call)?

        SELECT AVG(OrdersPlaced), MAX(OrdersPlaced), MIN(OrdersPlaced)
        FROM (
            SELECT v.VendorID, v.[Name] AS VendorName, COUNT(*) AS OrdersPlaced
            FROM Purchasing.PurchaseOrderHeader AS poh
            INNER JOIN Purchasing.Vendor AS v ON poh.VendorID = v.VendorID
            GROUP BY v.VendorID, v.[Name]
        ) AS x

    Thanks

    Read the article

  • SQL Update to the SUM if it's joined values

    - by CL4NCY
    Hi, I'm trying to update a field in the database to the sum of its joined values:

        UPDATE P
        SET extrasPrice = SUM(E.price)
        FROM dbo.BookingPitchExtras AS E
        INNER JOIN dbo.BookingPitches AS P
            ON E.pitchID = P.ID AND P.bookingID = 1
        WHERE E.[required] = 1

    When I run this I get the following error: "An aggregate may not appear in the set list of an UPDATE statement." Any ideas?

    Read the article

  • SQL Alias as Table.column

    - by bakerjr
    Hi, is it possible to alias an aggregate function in a SELECT clause as AliasTable.AliasColumn? The reason is that I want minimal modifications in my arrays (I'm using PHP). Thanks! P.S. I'm using SQL Server 2008.

    Read the article

  • Subrows into a row in GridView using WPF & C#

    - by toni
    Hi! In my GridView I need to aggregate subrows into each row, something like in the P2P eMule/aMule applications, where you can double-click each file you are downloading and then see, under it, the parts of the file and where they are being downloaded from. Is this possible in WPF? Thanks.

    Read the article
