Search Results

Search found 2156 results on 87 pages for 'weighted average'.

Page 56/87 | < Previous Page | 52 53 54 55 56 57 58 59 60 61 62 63  | Next Page >

  • In MongoDB, how can I replicate this simple query using map/reduce in ruby?

    - by Matthew Rathbone
    Hi, so using the regular MongoDB library in Ruby I have the following query to find the average filesize across a set of 5001 documents:

    avg = 0
    total = collection.count()
    Rails.logger.info "#{total} asset creation stats in the system"
    collection.find().each {|row| avg += (row["filesize"] * (1/total.to_f)) if row["filesize"]}

    It's pretty simple, so I'm trying to do the same using map/reduce as a learning exercise. This is what I came up with:

    map = 'function(){emit("filesizes", {size: this.filesize, num: 1});}'
    reduce = 'function(k, vals){
      var result = {size: 0, num: 0};
      for(var x in vals) {
        var new_total = result.num + vals[x].num;
        result.num = new_total;
        result.size = result.size + (vals[x].size * (vals[x].num / new_total));
      }
      return result;
    }'
    @results = collection.map_reduce(map, reduce)

    However, the two queries come back with two different results! What am I doing wrong?
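
    One common culprit in mismatches like this is a reduce function that is not associative: when reduce runs over partial results, the running average in result.size has to be re-weighted by its own count, not only by the incoming value's share. The usual pattern is to have map emit raw sums and counts, have reduce only add them up, and do the division in a finalize step. Below is a minimal pure-Python mock of that shape (no MongoDB involved, document values are made up) just to illustrate why it stays correct however the partial results are grouped:

        # Toy stand-ins for the JS map/reduce/finalize functions.
        docs = [{"filesize": 120}, {"filesize": 300}, {}, {"filesize": 60}]

        def map_doc(doc):
            if "filesize" in doc:
                yield ("filesizes", {"sum": doc["filesize"], "num": 1})

        def reduce_vals(key, vals):
            # Only additions here, so re-reducing partial results is safe.
            return {"sum": sum(v["sum"] for v in vals),
                    "num": sum(v["num"] for v in vals)}

        def finalize(key, val):
            return val["sum"] / val["num"] if val["num"] else 0

        emitted = [v for doc in docs for _, v in map_doc(doc)]
        print(finalize("filesizes", reduce_vals("filesizes", emitted)))   # 160.0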

    Read the article

  • Getting pixel averages of a vector sitting atop a bitmap...

    - by user346511
    I'm currently involved in a hardware project where I am mapping triangular shaped LED to traditional bitmap images. I'd like to overlay a triangle vector onto an image and get the average pixel data within the bounds of that vector. However, I'm unfamiliar with the math needed to calculate this. Does anyone have an algorithm or a link that could send me in the right direction? (I tagged this as Python, which is preferred, but I'd be happy with the general algorithm!) I've created a basic image of what I'm trying to capture here: http://imgur.com/Isjip.gif
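
    One standard way to do this is to rasterize the triangle yourself: walk the pixels in the triangle's bounding region, keep the ones that pass a point-in-triangle (same-side / cross-product sign) test, and average those. A rough numpy sketch of the idea, with a made-up image and made-up vertex coordinates:

        import numpy as np

        def triangle_mean(img, v0, v1, v2):
            # img is H x W (grayscale) or H x W x C; v0..v2 are (x, y) vertices.
            X, Y = np.meshgrid(np.arange(img.shape[1]), np.arange(img.shape[0]))

            def edge(a, b):
                # Sign of the cross product tells which side of edge a->b each pixel is on.
                return (b[0] - a[0]) * (Y - a[1]) - (b[1] - a[1]) * (X - a[0])

            d0, d1, d2 = edge(v0, v1), edge(v1, v2), edge(v2, v0)
            inside = ((d0 >= 0) & (d1 >= 0) & (d2 >= 0)) | ((d0 <= 0) & (d1 <= 0) & (d2 <= 0))
            return img[inside].mean(axis=0)

        img = np.random.randint(0, 256, (100, 100))          # stand-in for the bitmap
        print(triangle_mean(img, (10, 10), (80, 20), (40, 90)))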

    Read the article

  • Python / Django : emulating a multidimensionnal layer on a mySql database

    - by Sébastien Piquemal
    Hi, I'm working on a Django project where I need to provide a lot of different visualizations of the same data (for example, the average of a value for each month, for each year, for a location, etc.). I used an OLAP database once in college, and I thought that it would fit my needs, but it appears that it is much too heavy for what I need. Actually the volume of data is not very big, so I don't need any optimization, just a way to present different visualizations of the same data without having to write the same code 1000 times. So let's recap: I need a Python library that:
    - emulates a multidimensional database (OLAP style would be nice because I think it is quite convenient: stat structure, and everything)
    - is non-intrusive, because I can't modify anything on the existing MySQL database
    - is easy to use, because otherwise there's no point in replacing some overhead by another.
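
    If the data really is small, a dictionary-based rollup over plain query results may be all the "multidimensional layer" that is needed. A minimal sketch (the column names and rows below are invented; in the real project the rows would come from the existing MySQL tables via the Django ORM or raw SQL):

        from collections import defaultdict

        def rollup(rows, dims, measure, agg=lambda xs: sum(xs) / len(xs)):
            # Group dict-rows by the chosen dimension keys and aggregate one measure.
            groups = defaultdict(list)
            for row in rows:
                groups[tuple(row[d] for d in dims)].append(row[measure])
            return {key: agg(values) for key, values in groups.items()}

        rows = [
            {"year": 2009, "month": 1, "location": "Paris", "value": 10},
            {"year": 2009, "month": 2, "location": "Paris", "value": 14},
            {"year": 2010, "month": 1, "location": "Lyon",  "value": 8},
        ]
        print(rollup(rows, ("year",), "value"))             # average per year
        print(rollup(rows, ("year", "location"), "value"))  # average per year and location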

    Read the article

  • Averaging corrupted images to eliminate the noise in Matlab

    - by Mertie Pertie
    Hi all. As you can tell from the title, I want to average some .jpg images which are corrupted by zero-mean additive Gaussian noise. After searching the internet, I figured I should add the image matrices and divide the sum by the number of matrices. However, the resulting image is totally black. Normally, when the number of images increases, the result should get better, but when I use more images it gets darker. I am using 800x600 black and white images with the .jpg extension. Here is the script I used:

    image1 = imread ('PIC1.jpg');
    image2 = imread ('PIC2.jpg');
    image3 = imread ('PIC3.jpg');
    image4 = imread ('PIC4.jpg');
    sum = image1 + image2 + image3 + image4;
    av = sum / 4;
    imshow(av);

    Thanks in advance
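
    A likely cause (offered as a hunch, not a certainty): imread returns uint8 matrices, and uint8 addition saturates at 255, so the sum is clipped before the division and dividing by 4 (or more) pushes everything toward black. Casting to double before summing, e.g. with double() in Matlab, usually fixes it. The same idea in a small Python/numpy sketch (file names reused from the question purely for illustration):

        import numpy as np
        from PIL import Image

        files = ["PIC1.jpg", "PIC2.jpg", "PIC3.jpg", "PIC4.jpg"]
        # Cast to float *before* summing so 8-bit arithmetic cannot clip at 255.
        stack = np.stack([np.asarray(Image.open(f), dtype=np.float64) for f in files])
        average = stack.mean(axis=0)
        Image.fromarray(average.astype(np.uint8)).save("averaged.jpg")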

    Read the article

  • Huge page buffer vs. multiple simultaneous processes

    - by Andrei K.
    One of our customers has a 35 GB database with an average active connection count of about 70-80. Some tables in the database have more than 10M records per table. Now they have bought a new server: 4 * 6 Core = 24 Cores CPU, 48 GB RAM, 2 RAID controllers with 256 MB cache and 8 SAS 15K HDDs on each, 64-bit OS. I'm wondering which would be the faster configuration: 1) FB 2.5 SuperServer with a huge buffer of 8192 * 3500000 pages = 29 GB, or 2) FB 2.5 Classic with a small buffer of 1000 pages. Maybe someone has tested such a case before and will save me days of work :) Thanks in advance.

    Read the article

  • Script Speed vs Memory Usage

    - by Doug Neiner
    I am working on an image generation script in PHP and have gotten it working two ways. One way is slow but uses a limited amount of memory, the second is much faster, but uses 6x the memory. There is no leakage in either script (as far as I can tell). In a limited benchmark, here is how they performed:

    --------------------------------------------
    METHOD | TOTAL TIME | PEAK MEMORY | IMAGES
    --------------------------------------------
    One    |     65.626 |     540,036 |    200
    Two    |     20.207 |   3,269,600 |    200
    --------------------------------------------

    And here is the average of the previous numbers (if you don't want to do your own math):

    --------------------------------------------
    METHOD | TOTAL TIME | PEAK MEMORY | IMAGES
    --------------------------------------------
    One    |      0.328 |     540,036 |      1
    Two    |      0.101 |   3,269,600 |      1
    --------------------------------------------

    Which method should I use and why? I anticipate this being used by a high volume of users, with each user making 10-20 requests to this script during a normal visit. I am leaning toward the faster method because, though it uses more memory, it runs in 1/3 of the time and would reduce the number of concurrent requests.
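
    A quick back-of-the-envelope way to compare the two is Little's law: average concurrency is roughly the request rate times the time each request holds its memory. The rate below is an invented assumption; the times and peak memory figures come from the table above.

        rate = 50  # requests per second, hypothetical load

        for name, seconds, peak_bytes in [("One", 0.328, 540_036), ("Two", 0.101, 3_269_600)]:
            concurrent = rate * seconds
            print(f"Method {name}: ~{concurrent:.1f} concurrent requests, "
                  f"~{concurrent * peak_bytes / 1_048_576:.1f} MiB of peak image memory")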

    Read the article

  • How to set a filter for an estimated maximum price

    - by David
    I cannot figure out how to set an estimated maximum price for a collection of records. What I want to avoid is simply using SQL MAX, because there may be records with exorbitant prices. For example, in the "computers-hardware" category of OLX (http://www.olx.com/computers-hardware-cat-240) the filter for maximum price seems to be set to an estimated $1400, but when sorting by price, the first items are above $10000. Maybe they calculated the average and then estimated some maximum price... what do you think? And what about the stepping? How would you calculate it?
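
    One common approach (an assumption about how such sites do it, not a description of OLX itself) is to cap the filter at a high percentile of the observed prices, so outliers cannot stretch the slider, and then round the cap and the step up to "nice" numbers. A small sketch with made-up prices:

        def price_filter(prices, percentile=0.95, steps=14):
            prices = sorted(prices)
            cap = prices[int((len(prices) - 1) * percentile)]
            # Round the cap up to a "round" figure so it looks hand-picked.
            magnitude = 10 ** (len(str(int(cap))) - 2) if cap >= 10 else 1
            cap = -(-cap // magnitude) * magnitude
            return cap, cap / steps   # (estimated maximum, step size)

        prices = [35, 60, 80, 120, 250, 400, 999, 1350, 12999]   # one outlier at the top
        print(price_filter(prices))   # (1400, 100.0)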

    Read the article

  • Does Microsoft hate firefox? ASP.Net gridview performance in firefox bug?

    - by Maxim Gershkovich
    Could someone please explain the significant difference in speed between a Firefox UpdatePanel async postback and one performed in IE?

    Average Firefox postback time for 500 objects: 1.183 seconds
    Average IE postback time for 500 objects: 0.295 seconds

    Using Firebug I can see that the majority of this time in Firefox is spent on the server side (a total of 1.04 seconds). Given this fact, the only thing I can assume is causing this problem is the way that ASP.NET renders its controls between the two browsers. Has anyone run into this problem before?

    VB.Net Code

    Protected Sub Button1_Click(ByVal sender As Object, ByVal e As EventArgs) Handles Button1.Click
        GridView1.DataBind()
    End Sub

    Public Function GetStockList() As StockList
        Dim res As New StockList
        For l = 0 To 500
            Dim x As New Stock With {.Description = "test", .ID = Guid.NewGuid}
            res.Add(x)
        Next
        Return res
    End Function

    Public Class Stock
        Private m_ID As Guid
        Private m_Description As String

        Public Sub New()
        End Sub

        Public Property ID() As Guid
            Get
                Return Me.m_ID
            End Get
            Set(ByVal value As Guid)
                Me.m_ID = value
            End Set
        End Property

        Public Property Description() As String
            Get
                Return Me.m_Description
            End Get
            Set(ByVal value As String)
                Me.m_Description = value
            End Set
        End Property
    End Class

    Public Class StockList
        Inherits List(Of Stock)
    End Class

    Markup

    <form id="form1" runat="server">
    <asp:ScriptManager ID="ScriptManager1" runat="server">
    </asp:ScriptManager>
    <script type="text/javascript" language="Javascript">
        function timestamp_class(this_current_time, this_start_time, this_end_time, this_time_difference) {
            this.this_current_time = this_current_time;
            this.this_start_time = this_start_time;
            this.this_end_time = this_end_time;
            this.this_time_difference = this_time_difference;
            this.GetCurrentTime = GetCurrentTime;
            this.StartTiming = StartTiming;
            this.EndTiming = EndTiming;
        }

        //Get current time from date timestamp
        function GetCurrentTime() {
            var my_current_timestamp;
            my_current_timestamp = new Date(); //stamp current date & time
            return my_current_timestamp.getTime();
        }

        //Stamp current time as start time and reset display textbox
        function StartTiming() {
            this.this_start_time = GetCurrentTime(); //stamp current time
        }

        //Stamp current time as stop time, compute elapsed time difference and display in textbox
        function EndTiming() {
            this.this_end_time = GetCurrentTime(); //stamp current time
            this.this_time_difference = (this.this_end_time - this.this_start_time) / 1000; //compute elapsed time
            return this.this_time_difference;
        }
        //-->
    </script>
    <script type="text/javascript" language="javascript">
        var time_object = new timestamp_class(0, 0, 0, 0); //create new time object and initialize it

        Sys.WebForms.PageRequestManager.getInstance().add_beginRequest(BeginRequestHandler);
        Sys.WebForms.PageRequestManager.getInstance().add_endRequest(EndRequestHandler);

        function BeginRequestHandler(sender, args) {
            var elem = args.get_postBackElement();
            ActivateAlertDiv('visible', 'divAsyncRequestTimer', elem.value + '');
            time_object.StartTiming();
        }

        function EndRequestHandler(sender, args) {
            ActivateAlertDiv('visible', 'divAsyncRequestTimer', '(' + time_object.EndTiming() + ' Seconds)');
        }

        function ActivateAlertDiv(visstring, elem, msg) {
            var adiv = $get(elem);
            adiv.style.visibility = visstring;
            adiv.innerHTML = msg;
        }
    </script>
    <asp:UpdatePanel ID="UpdatePanel1" runat="server">
        <Triggers>
            <asp:AsyncPostBackTrigger ControlID="Button1" EventName="click" />
        </Triggers>
        <ContentTemplate>
            <asp:UpdateProgress ID="UpdateProgress1" runat="server" AssociatedUpdatePanelID="UpdatePanel1">
            </asp:UpdateProgress>
            <asp:Button ID="Button1" runat="server" Text="Button" />
            <div id="divAsyncRequestTimer" style="font-size:small;"></div>
            <asp:GridView ID="GridView1" runat="server" DataSourceID="ObjectDataSource1" AutoGenerateColumns="False">
                <Columns>
                    <asp:BoundField DataField="ID" HeaderText="ID" SortExpression="ID" />
                    <asp:BoundField DataField="Description" HeaderText="Description" SortExpression="Description" />
                </Columns>
            </asp:GridView>
            <asp:ObjectDataSource ID="ObjectDataSource1" runat="server" SelectMethod="GetStockList" TypeName="WebApplication1._Default">
            </asp:ObjectDataSource>
        </ContentTemplate>
    </asp:UpdatePanel>
    </form>

    Read the article

  • How much time do you spend in Reflector? (.NET)

    - by mannu
    As a consultant I get to toy around with many different products and APIs as the customer demands we use X and Y. I think it is great fun and I learn a lot from it. What will make a great developer over time is, in my opinion, the will to understand and learn new things. Therefore, I will always try to understand what happens "behind the scenes" when I am using 3rd party products. I spend around 10-15% of my time in Reflector to learn what the heck I'm really doing when I call method X. How much time do you spend on average? This may also apply to reading (open) source code, documentation etc.

    Read the article

  • audio power on AudioQueue

    - by Tomoyuki
    Hi everyone. I'm now creating an application using speech recognition. To check the audio power coming in through the microphone, I wrote a method as follows:

    - (void)checkPower:(AudioQueueRef)queue {
        UInt32 expectedSize = sizeof(AudioQueueLevelMeterState);
        AudioQueueGetProperty(queue,
                              kAudioQueueProperty_CurrentLevelMeter,
                              &audioLevels,
                              &expectedSize);
        NSLog(@"average:%f peak:%f", audioLevels.mAveragePower, audioLevels.mPeakPower);
    }

    I found that sometimes mAveragePower was larger than mPeakPower, and when mAveragePower was 1.0 (in other words, when the average power was at its maximum), mPeakPower was lower than 1.0. I think that generally this result should be impossible. Please let me know if you have any information about sound power in Core Audio. Thanks.

    Read the article

  • How to allow sorting in the Django admin by a custom list_display field, which doesn't have a DB field

    - by Gj
    I have a custom list_display field which is responsible for a column of integers on one of my admin pages. I need to allow staff members to sort by it. There's a solution for how to achieve that if the integer represents a count/average/etc. of some DB field, which is not the case for me [the solution for that case is here: http://stackoverflow.com/questions/2168475/django-admin-how-to-sort-by-one-of-the-custom-list-display-fields-that-has-no-da ]. Any ideas how I can achieve this sorting without actually creating and maintaining a DB field for the values?

    Read the article

  • How to tell which thread(s) are producing all the garbage?

    - by Brad Hein
    I have an app with about 15 threads. Most do mundane tasks and sleep most of their lives. Others collect information and cache it in hashmaps. The hashmaps grow to a moderate size and level out. The number of keys and size of value remains constant, but the contents of the values changes (at 33 keys per second average). When I start my app, I notice the garbage collection interval goes from minutes to once per second, and the amount of garbage is 700k+ each time. In fact as I was writing this, it caused my phone to reboot with an error "Referencetable Overflow". Here's my question: Are there any tricks to identifying which threads are producing the garbage, or even finding out more about what garbage they are producing?

    Read the article

  • Tool for response time analysis on JBoss server?

    - by Ariel Vardi
    I am running a pretty high-traffic cluster of JBoss servers serving REST requests, and I am interested in tools that read the access logs in Tomcat format (with the %D parameter) and provide a detailed analysis of the response time on a per-call basis. Ideally this tool would generate a chart showing the progression of the response time throughout the day, hour by hour, then a weekly view with averages per day, and a monthly view with averages per week (CACTI style). I've looked for such tools and couldn't find anything. Are any of you aware of something close to that, before I start writing my own? I haven't looked into CACTI extensions yet, but that could be an option.
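
    Failing a ready-made tool, the hourly roll-up itself is not much code. A rough sketch, assuming a combined-style access log whose last field is Tomcat's %D value in milliseconds; the regex and file name are assumptions to adjust to the real format:

        import re
        from collections import defaultdict

        LINE = re.compile(r'\[(?P<day>[^:]+):(?P<hour>\d{2}):\d{2}:\d{2}[^\]]*\].*\s(?P<millis>\d+)$')

        totals = defaultdict(lambda: [0, 0])   # (day, hour) -> [sum of millis, request count]
        with open("access.log") as log:
            for line in log:
                m = LINE.search(line)
                if m:
                    key = (m.group("day"), m.group("hour"))
                    totals[key][0] += int(m.group("millis"))
                    totals[key][1] += 1

        for (day, hour), (millis, count) in sorted(totals.items()):
            print(f"{day} {hour}:00  avg {millis / count:.1f} ms over {count} requests")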

    Read the article

  • Why a thread is aborted in ASP.NET MVC (again)?

    - by Dario Solera
    Here is what I do in a controller action:
    - create and start a new Thread that does a relatively long processing task (~30 seconds on average, but might be several minutes)
    - immediately return the page response so the user knows processing has started (trivially, a Json with a task ID for polling purposes).
    At some random point, ThreadAbortException is thrown, so the async task does not complete. The exception is not thrown every time, it just happens randomly roughly 25% of the times. Points to note:
    - I'm not calling Response.End or Response.Redirect - there isn't even a request running when the exception is thrown
    - I tried using ThreadPool and I got the same behavior
    - I know running threads in ASP.NET has several caveats but I don't care right now
    Any suggestion?

    Read the article

  • Tag Cloud Data Backend

    - by Waldron
    I want to be able to generate tag clouds from free text that comes from any number of different sources. For clarity, I'm not talking about how to display a tag cloud once the critical tags/phrases are already discovered; I'm hoping to be able to discover the meaningful phrases themselves, preferably on a PHP/MySQL stack. If I had to do this myself, I'd start by establishing some kind of index for words/phrases that gives a "normal" frequency for any word/phrase. E.g. "Constantinople" occurs once in every 1,000,000 words on average (normal frequency "0.000001"). Then as I analyze a body of text, I'd find the individual words/phrases (another challenge!), find the frequency of each within the input, and measure it against the expected frequency. Words that have the highest ratio against expected frequency get boosted priority in the cloud. I'd like to believe someone else has already done this, WAY better than I could hope to, but I'll be damned if I can find it. Any recommendations?
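
    For what it's worth, the ratio-against-expected-frequency idea described above fits in a few lines once a background frequency table exists. A toy sketch (the background table and default frequency are invented; a real one would be built from a large reference corpus, and multi-word phrases would need their own extraction pass):

        import math, re
        from collections import Counter

        background = {"the": 0.05, "of": 0.03, "in": 0.02, "constantinople": 0.000001}
        DEFAULT = 0.00001   # assumed frequency for words missing from the table

        def tag_weights(text, top=10):
            words = re.findall(r"[a-z']+", text.lower())
            counts = Counter(words)
            total = sum(counts.values())
            # Log of (observed frequency / expected frequency): higher means more cloud-worthy.
            scores = {w: math.log((c / total) / background.get(w, DEFAULT)) for w, c in counts.items()}
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top]

        print(tag_weights("The walls of Constantinople held until Constantinople fell in 1453."))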

    Read the article

  • Creating huge images

    - by David Rutten
    My program has the feature to export a hi-res image of the working canvas to the disk. Users will frequently try to export images of about 20,000 x 10,000 pixels @ 32bpp which equals about 800MB. Add that to the serious memory consumption already going on in your average 3D CAD program and you'll pretty much guarantee an out-of-memory crash on 32-bit platforms. So now I'm exporting tiles of 1000x1000 pixels which the user has to stitch together afterwards in a pixel editor. Is there a way I can solve this problem without the user doing any work? I figured I could probably write a small exe that gets command-lined into the process and performs the stitching automatically. It would be a separate process and it would thus have 2GB of ram all to itself. Or is there a better way still? I'd like to support jpg, png and bmp so writing the image as a bytestream to the disk is not really possible.
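
    If the stitching-helper route is taken, the helper itself is short. A sketch of the idea (the tile naming scheme, tile size and grid dimensions are all assumptions); it deliberately runs as its own process so the single large allocation lives outside the CAD application:

        from PIL import Image

        TILE = 1000
        COLS, ROWS = 20, 10   # 20,000 x 10,000 pixel target

        canvas = Image.new("RGB", (COLS * TILE, ROWS * TILE))
        for col in range(COLS):
            for row in range(ROWS):
                # Hypothetical tile naming convention: tile_<col>_<row>.png
                tile = Image.open(f"tile_{col}_{row}.png")
                canvas.paste(tile, (col * TILE, row * TILE))
        canvas.save("export.png")

    Formats like PNG, JPEG and BMP can also be written strip by strip with lower-level encoders, which avoids ever holding the whole canvas in memory, but that is considerably more work than a one-shot stitcher.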

    Read the article

  • How to use avg function?

    - by Marcelo
    I'm new at PHP and MySQL stuff and I'm trying to use an AVG function but I don't know how to. I'm trying to do something like this:

    mysql_connect(localhost,$username,$password);
    @mysql_select_db($database) or die ("Did not connect to $database");
    mysql_query("AVG(column1) FROM table1 ") or die(mysql_error());
    mysql_close();
    echo AVG(column1);

    (Q1) I'd like to see the value printed on the screen, but I'm getting nothing but an error message. How could I print this average on the screen?
    (Q2) If I had a column month in my table1, how could I print the averages by month?
    Sorry for any bad English, and thanks for the attention.
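
    In case it helps a reader skimming this listing: the query string above is missing its SELECT, and the value has to be fetched from the result set rather than echoed with AVG() in PHP. The same SQL pattern is shown here with Python's built-in sqlite3, purely so the sketch is self-contained and runnable; essentially the same statements work in MySQL:

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE table1 (month TEXT, column1 REAL)")
        db.executemany("INSERT INTO table1 VALUES (?, ?)",
                       [("Jan", 10), ("Jan", 20), ("Feb", 30)])

        # Q1: the overall average
        (avg,) = db.execute("SELECT AVG(column1) FROM table1").fetchone()
        print(avg)                                                # 20.0

        # Q2: one average per month
        for month, avg in db.execute("SELECT month, AVG(column1) FROM table1 GROUP BY month"):
            print(month, avg)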

    Read the article

  • Tips for improving performance of DB that is above size 40 GB (Sql Server 2005) and growing monthly

    - by HotTester
    The current DB of our project has crossed 40 GB this month and on average it is growing by around 3 GB monthly. All the tables are properly normalized and proper indexing has been used. But still, as the size grows, it is taking more time to run even basic queries like 'select count(1) from table'. So can you share some more points that will help on this front? The database is SQL Server 2005. Further, if we implement partitioning, wouldn't it create an overhead? Thanks in advance.

    Read the article

  • Need to get pixel averages of a vector sitting on a bitmap...

    - by user346511
    I'm currently involved in a hardware project where I am mapping triangular shaped LED to traditional bitmap images. I'd like to overlay a triangle vector onto an image and get the average pixel data within the bounds of that vector. However, I'm unfamiliar with the math needed to calculate this. Does anyone have an algorithm or a link that could send me in the right direction? I'm not even clear what this type of math is called. I've created a basic image of what I'm trying to capture here: http://imgur.com/Isjip.gif

    Read the article

  • How should I join these 3 SQL queries in Oracle?

    - by Nazgulled
    I have these 3 queries:

    SELECT title, year, MovieGenres(m.mid) genres, MovieDirectors(m.mid) directors, MovieWriters(m.mid) writers, synopsis, poster_url
    FROM movies m
    WHERE m.mid = 1;

    SELECT AVG(rating) FROM movie_ratings WHERE mid = 1;

    SELECT COUNT(rating) FROM movie_ratings WHERE mid = 1;

    And I need to join them into a single query. I was able to do it like this:

    SELECT title, year, MovieGenres(m.mid) genres, MovieDirectors(m.mid) directors, MovieWriters(m.mid) writers, synopsis, poster_url, AVG(rating) average, COUNT(rating) count
    FROM movies m
    INNER JOIN movie_ratings mr ON m.mid = mr.mid
    WHERE m.mid = 1
    GROUP BY title, year, MovieGenres(m.mid), MovieDirectors(m.mid), MovieWriters(m.mid), synopsis, poster_url;

    But I don't really like that "huge" GROUP BY, is there a simpler way to do it?
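
    One way to avoid grouping by every selected column (offered as a sketch, not tested against this schema) is to fold the two aggregates into scalar subqueries, which Oracle allows in the select list. Shown as a query string inside a small Python snippet; "rating_count" is used as the alias because COUNT is awkward as a column name:

        query = """
        SELECT title, year,
               MovieGenres(m.mid) genres,
               MovieDirectors(m.mid) directors,
               MovieWriters(m.mid) writers,
               synopsis, poster_url,
               (SELECT AVG(rating) FROM movie_ratings WHERE mid = m.mid) average,
               (SELECT COUNT(rating) FROM movie_ratings WHERE mid = m.mid) rating_count
        FROM movies m
        WHERE m.mid = 1
        """
        print(query)   # hand this to cursor.execute(query) on whatever Oracle driver is in use

    A side effect worth noting: a movie with no ratings still comes back (with a NULL average and a zero count) instead of being dropped by the inner join.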

    Read the article

  • Compute column widths in a HTML-like manner (based on cell contents)

    - by cipak
    Hi, I have a grid of data that I want to export to RTF, PDF etc. using various (and not perfect) PHP converters/generators. What I am missing most is the HTML table automatic adjustment of column widths based on the lengths of strings in the cells (strings contain line breaks which complicate things a bit, as they should be preserved). I need an algorithm that, given the contents of the cells (plain text), a total width of the table and an average width of a character, would return a width for each column. I wouldn't want to reinvent the wheel if something is already available. Of course it can't be perfect if the font is variable width, but an approximation would do just fine. Or maybe it could have a configurable table with widths for each character. Any hint would be appreciated. Thank you.
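
    A crude first pass, should nothing ready-made turn up: measure each column by its longest line (splitting cells on line breaks so they are preserved), convert that to a "natural" width via the average character width, then scale all columns to the requested total. A sketch with invented sample data; a fuller version would also cap columns at their natural width and redistribute the slack:

        def column_widths(rows, total_width, char_width):
            # rows: list of rows, each a list of cell strings (cells may contain "\n").
            longest = [0] * len(rows[0])
            for row in rows:
                for i, cell in enumerate(row):
                    longest[i] = max(longest[i], *(len(line) for line in cell.split("\n")))
            natural = [n * char_width for n in longest]
            scale = total_width / sum(natural)
            return [w * scale for w in natural]

        rows = [["Name", "Comment"],
                ["Ann", "Short"],
                ["Bob", "A much longer comment\nwith a line break"]]
        print(column_widths(rows, total_width=500, char_width=6))   # roughly [80.0, 420.0]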

    Read the article

  • BCB: how to get the (approximate) width of a character in a given TFont?

    - by mawg
    It's a TMemo, not that that should make any difference. Googling suggests that I can use Canvas->TextWidth() but those are Delphi examples and BCB doesn't seem to offer this property. I really want something analogous to memo->Font->Height for width. I realize that not all fonts are fixed width, so a good estimate will do. All that I need is to take the width of a TMemo in pixels and make a reasonable guess at how many characters of the current font it will hold. Of course, if I really want to be lazy, I can just google for the average height/width ratio, since height is known. Remember, an approximation is good enough for me if it is tricky to get exact. http://www.plainlanguagenetwork.org/type/utbo211.htm says, " A width to height ratio of 3:5 (0.6) is recommended for most applications"
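
    As a sanity check on the lazy route, the 3:5 ratio quoted above turns into a one-line estimate: characters per line is roughly the pixel width divided by 0.6 times the font height. A tiny sketch with made-up numbers:

        memo_width_px = 400    # hypothetical TMemo client width
        font_height_px = 16    # hypothetical font height
        approx_chars_per_line = memo_width_px / (0.6 * font_height_px)
        print(round(approx_chars_per_line))   # about 42 characters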

    Read the article

  • In MYSQL is it better to have one big table or many smaller tables

    - by user307922
    Hi all, I am making a database of my clients' customers to send email promotions to. The database will include all of my clients (about 12), and each of them has an average of 2100 customers. I was wondering if it would be better to have a table in the db for each one of my clients that contains a list of their customers, or if I should just make one big table... The customers will be queried daily. I know it is a broad question but any advice would be appreciated. Cheers, Chuck

    Read the article

  • Creating a Dynamic Image using Friends pictures

    - by Narendra Rajput
    I am working on a Facebook app for which I need to get all the profile pictures of the user's friends, which I did using an FQL query. I need these images to create a poster of all the friends' profile pics, with tags in them. For that I need to create a dynamic poster for every user with their friends tagged in it. I tried using the GD library for PHP, with the imagecreatefromjpeg() function, with which I can load one image and copy it onto the main image. But here I have more than one image (about 100 on average, depending on the number of friends the user has). What functions do I need to create this dynamic poster? Any help would be appreciated!

    Read the article

  • Float as DateTime

    - by lp1
    SQL Server 2008. I almost have, I think, what I'm looking to do; I'm just trying to fine-tune the result. I have a table that stores timestamps of all transactions that occur on the system. I'm writing a query to give an average transaction time. This is what I have so far:

    With TransTime AS (
        select endtime - starttime AS Totaltime
        from transactiontime
        where starttime > '2010-05-12' and endtime < '2010-05-13')
    Select CAST(AVG(CAST(TotalTime As Float)) As Datetime)
    from TransTime

    I'm getting the following result: 1900-01-01 00:00:00.007. I can't figure out how to strip the date off and just display the time, 00:00:00:007. Any help would be appreciated. Thank you.
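
    For what it's worth, one hedged option (not tested against the table above) is to format the averaged value with CONVERT style 114, which keeps only hh:mi:ss:mmm; on SQL Server 2008 a CAST to the time type is another route. Shown as a query string in a small Python snippet:

        sql = """
        WITH TransTime AS (
            SELECT endtime - starttime AS TotalTime
            FROM transactiontime
            WHERE starttime > '2010-05-12' AND endtime < '2010-05-13')
        SELECT CONVERT(varchar(12), CAST(AVG(CAST(TotalTime AS float)) AS datetime), 114)
        FROM TransTime
        """
        print(sql)   # run with pyodbc or whatever client is already in use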

    Read the article
