Search Results

Search found 895 results on 36 pages for 'scatter plot'.

Page 19/36

  • matrix multiplication with MPI [on hold]

    - by user3695701
    I'm working on an assignment on matrix multiplication with MPI: A*B = C. The requirement is that B should be vertically partitioned. Here's what I intend to do: broadcast matrix A to all processes and scatter B into several slices, with each slice containing n/p columns. The following code only works when the number of processes (p) is 1. When p > 1 (say, 2), I get:

    [cluster2:21080] *** Process received signal ***
    [cluster2:21080] Signal: Segmentation fault (11)
    [cluster2:21080] Signal code: Address not mapped (1)
    [cluster2:21080] Failing at address: (nil)
    [cluster2:21080] [ 0] /lib/libpthread.so.0(+0xf8f0) [0x7f49f38108f0]
    [cluster2:21080] [ 1] /lib/libc.so.6(memcpy+0xe1) [0x7f49f35024c1]
    [cluster2:21080] [ 2] /usr/lib/libmpi.so.0(ompi_convertor_unpack+0x121)[0x7f49f47c88e1]
    [cluster2:21080] [ 3] /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so(+0x8a26) [0x7f49f0dcea26]
    [cluster2:21080] [ 4] /usr/lib/openmpi/lib/openmpi/mca_btl_tcp.so(+0x662c) [0x7f49efce462c]
    [cluster2:21080] [ 5] /usr/lib/libopen-pal.so.0(+0x1ede8) [0x7f49f42e0de8]
    [cluster2:21080] [ 6] /usr/lib/libopen-pal.so.0(opal_progress+0x99) [0x7f49f42d5369]
    [cluster2:21080] [ 7] /usr/lib/openmpi/lib/openmpi/mca_pml_ob1.so(+0x5585) [0x7f49f0dcb585]
    [cluster2:21080] [ 8] /usr/lib/openmpi/lib/openmpi/mca_coll_tuned.so(+0xcc01) [0x7f49eeeb1c01]
    [cluster2:21080] [ 9] /usr/lib/openmpi/lib/openmpi/mca_coll_tuned.so(+0x266c) [0x7f49eeea766c]
    [cluster2:21080] [10] /usr/lib/openmpi/lib/openmpi/mca_coll_sync.so(+0x1388) [0x7f49ef0c0388]
    [cluster2:21080] [11] /usr/lib/libmpi.so.0(MPI_Bcast+0x10e) [0x7f49f47d025e]
    [cluster2:21080] [12] ./out(main+0x259) [0x401571]
    [cluster2:21080] [13] /lib/libc.so.6(__libc_start_main+0xfd) [0x7f49f3498c8d]
    [cluster2:21080] [14] ./out() [0x400f29]
    [cluster2:21080] *** End of error message ***

    Can someone help me? Thanks.

    //matrices A and B
    //double* A =(double *)malloc(n*n*sizeof(double));
    //double* B =(double *)malloc(n*n*sizeof(double));
    //code initializing A,B...
    //n is the size of the matrix
    //p is the number of processes
    //myrank is the rank of the calling process
    MPI_Init (&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &myrank);
    MPI_Comm_size(MPI_COMM_WORLD, &p);

    //broadcast A to all processes
    MPI_Bcast (A, n*n, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    MPI_Datatype tmp_type, col_type;
    // extract a slice from B
    MPI_Type_vector(n, num_of_col_per_slice, n, MPI_DOUBLE, &tmp_type);
    // position of the first (0) and each next (stride * sizeof(double)) slice
    MPI_Type_create_resized(tmp_type, 0, n * sizeof(double), &col_type);
    MPI_Type_commit(&col_type);

    //scatter a slice of B to each process
    MPI_Scatter(B, 1, col_type, B+myrank*n/p, n * n/p, MPI_DOUBLE, 0, MPI_COMM_WORLD);

    //use blas function to calculate A*sliceOfB and store the resulting slice to C
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans, n, n/p, n, 1.0, A, n, B+myrank*n/p, n, 0.0, C+myrank*n/p, n);

    //gather all those resulting slices into C
    MPI_Gather (C+myrank*n/p, n*n/p, MPI_DOUBLE, C, n*n/p, MPI_DOUBLE, 0, MPI_COMM_WORLD);
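    For comparison, here is a minimal sketch of the same column-partitioned multiply in Python with mpi4py and NumPy (not the poster's C code; the matrix size and the assumption that n is divisible by p are illustrative only). It sidesteps the derived-datatype question by transposing B on the root so that each vertical slice is a contiguous block:

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, p = comm.Get_rank(), comm.Get_size()
    n = 8                                    # assumed divisible by p
    cols = n // p

    if rank == 0:
        A = np.random.rand(n, n)
        B = np.random.rand(n, n)
        # transpose so each vertical slice of B is a contiguous block
        B_slices = np.ascontiguousarray(B.T).reshape(p, cols, n)
    else:
        A = np.empty((n, n))
        B_slices = None

    comm.Bcast(A, root=0)                    # every rank needs all of A
    my_cols_T = np.empty((cols, n))          # my n/p columns of B, transposed
    comm.Scatter(B_slices, my_cols_T, root=0)

    my_C = A @ my_cols_T.T                   # n x (n/p) block of the result

    C_T = np.empty((p, cols, n)) if rank == 0 else None
    comm.Gather(np.ascontiguousarray(my_C.T), C_T, root=0)

    if rank == 0:
        C = C_T.reshape(n, n).T              # reassemble the column blocks
        assert np.allclose(C, A @ B)

    Receiving each slice into its own contiguous buffer (rather than back into B at an offset) is also one way to keep the send and receive type signatures matched in a C version.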

    Read the article

  • CodePlex Daily Summary for Friday, April 23, 2010

    CodePlex Daily Summary for Friday, April 23, 2010New Projects3D TagCloud for SharePoint 2010: 3D Flash TagCloud WebPart for SharePoint 2010AnyCAD.Net: AnyCAD.NetCassandraemon: Cassandraemon is LINQ Provider for Apache Cassandra.CCLI SongSelect Importer for PowerPoint: CCLI SongSelect Importer for PowerPoint ® is an Add-in for Microsoft ® PowerPoint ® that allows CCLI SongSelect (USR) files to be turned into slide...Compactar Arquivo Txt, Flat File, em Pipeline Decoder Customizado: Objetivo do projeto: Desenvolver um componente do tipo Pipeline Receiver Decoder, onde compacta o conteúdo, cria uma mensagem em XML e transforma ...Console Calculator: Console calculator is a simple, yet useful, mathematical expression calculator, supporting functions and variables. It was created to demonstrate ...CRM Dynamics Excel Importer: CRM Dynamics Excel Importercubace: The standard audio composer software with just single difference: this is CLR compilation.deneb: deneb projectDrive Backup: Drive Backup is an easy to use, automatic backup program. Simply insert a USB drive, and the program will backup either files on the drive to your ...eWebMVCCMS: this is the start of eWeb MVC CMS.Fix.ly: Small app that allows for URL rewriting before passing to the browser. Accepts MEF plugins that make themselves available by informing the applicat...GArphics: GArphics uses a genetic algorithm to produce graphics and animation with the assitance of the user.JDS Toolkit: An experimental toolkit geared to make richer applications with less effort. It will include controls such as the cubeoid and the serializedmenu. ...KrashSRC - MapleStory v.75 Emulator: KrashSRC - MapleStory v.75 EmulatorLast.fm Api: Last.fm api writen in Visaul Basic 2010.MIX 10 DVR and Downloader: A Silverlight application that will manage downloading the sessions and slide decks from the MIX '10 Conference utilizing the MIX OData feed for in...NSIS Autorun: This is a graphical CD/DVD/USB autorun engine that launches installers made with NSIS. Non-rectangular windows and animation are supported. Can be ...Pillbox: Windows Phone 7 sample application for tracking medications.PowerSharp: Very simple application that executes a snippet of PowerShell against C#. This will eventually be used with Live@EDU.Project Halosis: mmorpgProyecto Cero: Proyecto CeroSharePoint XSL Templates: This project is a place to share useful XSL templates that can be reused in SharePoint CQWPs and DVWPs.Silverlight 4.0 Popup Menu: Silverlight 4.0 Popup Menu spsearch: This project provides useful enhancements to Search using the SharePoint platform.StereoVision: StereVision es un proyecto que estudia un algoritmo de visión estereocopicaThe Stoffenmanager: The Stoffenmanager is a tool for prioritizing worker health risks to dangerous substances. But also a quantitative inhalation exposure tool and a ...Transcriber: Transcribe text from one character set to another. Extensible, plug-in based architecture. Default plug-in uses XML rules files with regular expres...Wavelets experiments: эксперименты с вейвлетамиWindows Phone 7 World of Warcraft Armory Browser: A test project to learn a little about Windows Phone development and do a decent armory browserXAML Based Chat: Xaml based chat. 
A simple chat systemNew Releases#Nose: SharpNose v1.1: Configuration is now done by updating SharpNose.exe.config MEF support added - you can also add your favorite test framework discovery Two tes...Baml Localizer: Version 0.1 (alpha): This is the first release which should show the capabilities of Baml Localizer. The code might still change a lot, but the file formats should be q...BibWord : Microsoft Word Citation and Bibliography styles: APA with DOI - Proof of Concept: IntroductionThis release is a proof of concept (POC) demonstrating a possible way of adding a digital object identifier (DOI) field to the APA styl...Chargify.NET: Chargify.NET v0.685: Releasing Version 0.685 - Changed customer reference ID from Guid to String for systems that don't use Guid as the unique key. - Added method for g...Compactar Arquivo Txt, Flat File, em Pipeline Decoder Customizado: SampleZipDecodePipeline: Solution contem Projeto com o Decoder Pipeline. Projeto para usar o Componente. Classes SharpZipLib para compactar e descompactar arquivosConsole Calculator: Console Calculator: Initial source code release.CSharp Intellisense: V1.6: UPDATE: 2010/04/05: description was added 2010/04/07: single selection + reset filter 20010/04/15: source code available at http://csharpintellis...Drive Backup: Drive Backup: Drive Backup allows you to automatically backup a USB device to your computer, or backup files/directories on your computer to a USB. Once you have...Event Scavenger: Thread recycling changes - Version 3.1: Change the location of where the settings for thread recycling is stored - Moved from config file to database for easier management. Version of dat...Extend SmallBasic: Teaching Extensions v.013: Added Houses QuizExtend SmallBasic: Teaching Extensions v.014: fixed a bug in Tortoise.approve rearranged the Houses Quiz to be more funFix.ly: Fix.ly 0.1: Initial test releaseFix.ly: Fix.ly 0.11: Fixed a couple bugs, including missing files in the previous releaseGArphics: Beta: This is the beta-version of the program. Version 1.0 shall be relased soon and will include a lot of improvements.HouseFly: HouseFly alpha 0.2.0.5: HouseFly alpha release 0.2.0.5HouseFly controls: HouseFly controls alpha 0.9.4: Version 0.9.4 alpha release of HouseFly controlsHTML Ruby: 6.21.8: Change Math.floor to round for text spacingHTML Shot: 0.1: Solved problems with some URLsJDS Toolkit: JDS Toolkit 0.1: Beta 0.1 version. Almost nothing in these librariesManaged Extensibility Framework: WebForms and MEF Sample: This sample demonstrates the use of these two technologies together in a non-invasive way. Information on how to use it on your own projects is inc...Microsoft - Domain Oriented N-Layered .NET 4.0 App Sample (Microsoft Spain): V0.7 - N-Layer DDD Sample App (VS.2010 RTM compat): Required Software (Microsoft Base Software needed for Development environment) Visual Studio 2010 RTM & .NET 4.0 RTM (Final Versions) Unity Applic...MvcContrib Portable Areas: Portable Areas: First Release of some portable areasNSIS Autorun: NSIS Autorun: Initial release.OgmoXNA: OgmoXNA Alpha Source Tree: Zipped version of the source tree in case you don't want to go through the SVN!Particle Plot Pivot: Particle Plot Pivot v1.0.0: Generates a Pivot collection of unpublished plots from the particle physics exeriments DZERO, CDF, ATLAS, and CMS. 
It can be found at http://deepta...patterns & practices SharePoint Guidance: SPG2010 Drop9: SharePoint Guidance Drop Notes Microsoft patterns and practices ****************************************** ***************************************...Rich Ajax empowered Web/Cloud Applications: 6.3.15: New Visual WebGui rich applications platform versionSilverlight 4.0 Popup Menu: PopupMenu for Silverlight 4: This is the first release of the popup menu class for Silverlight 4.0Silverlight Flow Layouts library: SL and WPF Flow Layouts library April 2010: This release introduces WPF 4.0 RTM and Silverlight 4 RTM support, as well as an additional layout algorithm and some minor bug fixes. Some changes...Spackle.NET: 3.0.0.0 Release: In this release: Spackle.dll now targets the 4.0 version of the .NET Framework SecureRandom implements IDisposable ActionExtensions have been ...Splinger FrameXi: Splinger 1.1: Welcome to a whole new way of learning! Go to release 1.0 for the non .zip packaged files.SQL Server Metadata Toolkit 2008: SQL Server Metadata Toolkit Alpha 6: This release addresses issues 10665, 10678 and 10679. The SQL Parser now understands CAST functions (the AS was causing issues), and is installed ...Star Trooper for XNA 2D Tutorial: Lesson four content: Here is Lesson four original content for the StarTrooper 2D XNA tutorial. It also includes the XNA version of Lesson four source. The blog tutori...Thales Simulator Library: Version 0.8.6: The Thales Simulator Library is an implementation of a software emulation of the Thales (formerly Zaxus & Racal) Hardware Security Module cryptogra...Transcriber: Transcriber v0.1: Initial alpha release. Very nearly useful. :-) This version includes rules files for Mode of Beleriand, Sindarin Tehtar, Quenya, and Black Speech. ...Visual Studio DSite: Picture Box Viewer (Visual F sharp 2008): A simple picturebox viewer made in visual f sharp 2008.Web/Cloud Applications Development Framework | Visual WebGui: 6.4 Beta 2d: Further stabilization of the cutting-edge web applications frameworkWebAssert: WebAssert 0.1: Initial release. Supports HTML & CSS validation using MSTest/Visual Studio testing.XAML Based Chat: Test release: A test releaseすとれおじさん(仮): すとれおじさん β 0.02: ・デザインを大幅に変更 ・まだかなり動作が重いです ・機能も少ないですMost Popular ProjectsRawrWBFS ManagerSilverlight ToolkitAJAX Control ToolkitMicrosoft SQL Server Product Samples: Databasepatterns & practices – Enterprise LibraryWindows Presentation Foundation (WPF)ASP.NETMicrosoft SQL Server Community & SamplesPHPExcelMost Active Projectspatterns & practices – Enterprise LibraryRawrParticle Plot PivotBlogEngine.NETNB_Store - Free DotNetNuke Ecommerce Catalog ModuleGMap.NET - Great Maps for Windows Forms & PresentationFarseer Physics EngineDotNetZip LibraryFluent Ribbon Control SuiteN2 CMS

    Read the article

  • jquery with ASP.NET MVC - calling ajax enabled web service

    - by dcp
    This is a bit of a continuation of a previous question. Now I'm trying to make a call to an AJAX enabled web service which I have defined within the ASP.NET MVC application (i.e. the MovieService.svc). But the service is never being called in my getMovies javascript function. This same technique of calling the AJAX web service works ok if I try it in a non ASP.NET MVC application, so it makes me wonder if maybe the ASP MVC routes are interfering with things somehow when it tries to make the AJAX web service call. Do you have any idea why my web service isn't getting called? Code below. <script src="<%= ResolveClientUrl("~/scripts/jquery-1.4.2.min.js") %>" type="text/javascript"></script> <script src="<%= ResolveClientUrl("~/scripts/grid.locale-en.js") %>" type="text/javascript"></script> <script src="<%= ResolveClientUrl("~/scripts/jquery-ui-1.8.1.custom.min.js") %>" type="text/javascript"></script> <script src="<%= ResolveClientUrl("~/scripts/jquery.jqGrid.min.js") %>" type="text/javascript"></script> <script type="text/javascript"> var lastsel2; function successFunction(jsondata) { debugger var thegrid = jQuery("#editgrid"); for (var i = 0; i < jsondata.d.length; i++) { thegrid.addRowData(i + 1, jsondata.d[i]); } } function getMovies() { debugger // ***** the MovieService#GetMovies method never gets called $.ajax({ url: 'MovieService.svc/GetMovies', data: "{}", // For empty input data use "{}", dataType: "json", type: "GET", contentType: "application/json; charset=utf-8", success: successFunction }); } jQuery(document).ready(function() { jQuery("#editgrid").jqGrid({ datatype: getMovies, colNames: ['id', 'Movie Name', 'Directed By', 'Release Date', 'IMDB Rating', 'Plot', 'ImageURL'], colModel: [ { name: 'id', index: 'Id', width: 55, sortable: false, hidden: true, editable: false, editoptions: { readonly: true, size: 10} }, { name: 'Movie Name', index: 'Name', width: 250, editable: true, editoptions: { size: 10} }, { name: 'Directed By', index: 'Director', width: 250, align: 'right', editable: true, editoptions: { size: 10} }, { name: 'Release Date', index: 'ReleaseDate', width: 100, align: 'right', editable: true, editoptions: { size: 10} }, { name: 'IMDB Rating', index: 'IMDBUserRating', width: 100, align: 'right', editable: true, editoptions: { size: 10} }, { name: 'Plot', index: 'Plot', width: 150, hidden: false, editable: true, editoptions: { size: 30} }, { name: 'ImageURL', index: 'ImageURL', width: 55, hidden: true, editable: false, editoptions: { readonly: true, size: 10} } ], pager: jQuery('#pager'), rowNum: 5, rowList: [5, 10, 20], sortname: 'id', sortorder: "desc", height: '100%', width: '100%', viewrecords: true, imgpath: '/Content/jqGridCss/redmond/images', caption: 'Movies from 2008', editurl: '/Home/EditMovieData/', caption: 'Movie List' }); $("#bedata").click(function() { var gr = jQuery("#editgrid").jqGrid('getGridParam', 'selrow'); if (gr != null) jQuery("#editgrid").jqGrid('editGridRow', gr, { height: 280, reloadAfterSubmit: false }); else alert("Hey dork, please select a row"); }); }); </script> <h2> <%= Html.Encode(ViewData["Message"]) %></h2> <p> To learn more about ASP.NET MVC visit <a href="http://asp.net/mvc" title="ASP.NET MVC Website"> http://asp.net/mvc</a>. 
</p> <table id="editgrid"> </table> <div id="pager" style="text-align: center;"> </div> <input type="button" id="bedata" value="Edit Selected" /> Here's my RegisterRoutes code: public static void RegisterRoutes(RouteCollection routes) { routes.IgnoreRoute("{resource}.axd/{*pathInfo}"); routes.IgnoreRoute("*MovieService.svc*"); routes.MapRoute( "Default", // Route name "{controller}/{action}/{id}", // URL with parameters new { controller = "Home", action = "Index", id = "" } // Parameter defaults ); } Here's what my MovieService class looks like: namespace jQueryMVC { [ServiceContract(Namespace = "")] [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)] public class MovieService { // Add [WebGet] attribute to use HTTP GET [OperationContract] [WebGet(ResponseFormat = WebMessageFormat.Json)] public IList<Movie> GetMovies() { return Persistence.GetMovies(); } } }

    Read the article

  • subset in geom_point SOMETIMES returns full dataset, instead of none.

    - by Andreas
    I ask the following in the hope that someone might come up with a generic description about the problem.Basically I have no idea whats wrong with my code. When I run the code below, plot nr. 8 turns out wrong. Specifically the subset in geom_point does not work the way it should. (update: With plot nr. 8 the whole dataset is plottet, instead of only the subset). If somebody can tell me what the problem is, I'll update this post. SOdata <- structure(list(id = 10:55, one = c(7L, 8L, 7L, NA, 7L, 8L, 5L, 7L, 7L, 8L, NA, 10L, 8L, NA, NA, NA, NA, 6L, 5L, 6L, 8L, 4L, 7L, 6L, 9L, 7L, 5L, 6L, 7L, 6L, 5L, 8L, 8L, 7L, 7L, 6L, 6L, 8L, 6L, 8L, 8L, 7L, 7L, 5L, 5L, 8L), two = c(7L, NA, 8L, NA, 10L, 10L, 8L, 9L, 4L, 10L, NA, 10L, 9L, NA, NA, NA, NA, 7L, 8L, 9L, 10L, 9L, 8L, 8L, 8L, 8L, 8L, 9L, 10L, 8L, 8L, 8L, 10L, 9L, 10L, 8L, 9L, 10L, 8L, 8L, 7L, 10L, 8L, 9L, 7L, 9L), three = c(7L, 10L, 7L, NA, 10L, 10L, NA, 10L, NA, NA, NA, NA, 10L, NA, NA, 4L, NA, 7L, 7L, 4L, 10L, 10L, 7L, 4L, 7L, NA, 10L, 4L, 7L, 7L, 7L, 10L, 10L, 7L, 10L, 4L, 10L, 10L, 10L, 4L, 10L, 10L, 10L, 10L, 7L, 10L), four = c(7L, 10L, 4L, NA, 10L, 7L, NA, 7L, NA, NA, NA, NA, 10L, NA, NA, 4L, NA, 10L, 10L, 7L, 10L, 10L, 7L, 7L, 7L, NA, 10L, 7L, 4L, 10L, 4L, 7L, 10L, 2L, 10L, 4L, 12L, 4L, 7L, 10L, 10L, 12L, 12L, 4L, 7L, 10L), five = c(7L, NA, 6L, NA, 8L, 8L, 7L, NA, 9L, NA, NA, NA, 9L, NA, NA, NA, NA, 7L, 8L, NA, NA, 7L, 7L, 4L, NA, NA, NA, NA, 5L, 6L, 5L, 7L, 7L, 6L, 9L, NA, 10L, 7L, 8L, 5L, 7L, 10L, 7L, 4L, 5L, 10L), six = structure(c(1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L), .Label = c("2010-05-25", "2010-05-27", "2010-06-07"), class = "factor"), seven = c(0.777777777777778, 0.833333333333333, 0.333333333333333, 0.888888888888889, 0.5, 0.888888888888889, 0.777777777777778, 0.722222222222222, 0.277777777777778, 0.611111111111111, 0.722222222222222, 1, 0.888888888888889, 0.722222222222222, 0.555555555555556, NA, 0, 0.666666666666667, 0.666666666666667, 0.833333333333333, 0.833333333333333, 0.833333333333333, 0.833333333333333, 0.722222222222222, 0.833333333333333, 0.888888888888889, 0.666666666666667, 1, 0.777777777777778, 0.722222222222222, 0.5, 0.833333333333333, 0.722222222222222, 0.388888888888889, 0.722222222222222, 1, 0.611111111111111, 0.777777777777778, 0.722222222222222, 0.944444444444444, 0.555555555555556, 0.666666666666667, 0.722222222222222, 0.444444444444444, 0.333333333333333, 0.777777777777778), eight = c(0.666666666666667, 0.333333333333333, 0.833333333333333, 0.666666666666667, 1, 1, 0.833333333333333, 0.166666666666667, 0.833333333333333, 0.833333333333333, 1, 1, 0.666666666666667, 0.666666666666667, 0.333333333333333, 0.5, 0, 0.666666666666667, 0.5, 1, 0.666666666666667, 0.5, 0.666666666666667, 0.666666666666667, 0.666666666666667, 0.333333333333333, 0.333333333333333, 1, 0.666666666666667, 0.833333333333333, 0.666666666666667, 0.666666666666667, 0.5, 0, 0.833333333333333, 1, 0.666666666666667, 0.5, 0.666666666666667, 0.666666666666667, 0.5, 1, 0.833333333333333, 0.666666666666667, 0.833333333333333, 0.666666666666667), nine = c(0.307692307692308, NA, 0.461538461538462, 0.538461538461538, 1, 0.769230769230769, 0.538461538461538, 0.692307692307692, 0, 0.153846153846154, 0.769230769230769, NA, 0.461538461538462, NA, NA, NA, NA, 0, 0.615384615384615, 0.615384615384615, 0.769230769230769, 0.384615384615385, 0.846153846153846, 0.923076923076923, 0.615384615384615, 0.692307692307692, 
0.0769230769230769, 0.846153846153846, 0.384615384615385, 0.384615384615385, 0.461538461538462, 0.384615384615385, 0.461538461538462, NA, 0.923076923076923, 0.692307692307692, 0.615384615384615, 0.615384615384615, 0.769230769230769, 0.0769230769230769, 0.230769230769231, 0.692307692307692, 0.769230769230769, 0.230769230769231, 0.769230769230769, 0.615384615384615), ten = c(0.875, 0.625, 0.375, 0.75, 0.75, 0.75, 0.625, 0.875, 1, 0.125, 1, NA, 0.625, 0.75, 0.75, 0.375, NA, 0.625, 0.5, 0.75, 0.875, 0.625, 0.875, 0.75, 0.625, 0.875, 0.5, 0.75, 0, 0.5, 0.875, 1, 0.75, 0.125, 0.5, 0.5, 0.5, 0.625, 0.375, 0.625, 0.625, 0.75, 0.875, 0.375, 0, 0.875), elleven = c(1, 0.8, 0.7, 0.9, 0, 1, 0.9, 0.5, 0, 0.8, 0.8, NA, 0.8, NA, NA, 0.8, NA, 0.4, 0.8, 0.5, 1, 0.4, 0.5, 0.9, 0.8, 1, 0.8, 0.5, 0.3, 0.9, 0.2, 1, 0.8, 0.1, 1, 0.8, 0.5, 0.2, 0.7, 0.8, 1, 0.9, 0.6, 0.8, 0.2, 1), twelve = c(0.666666666666667, NA, 0.133333333333333, 1, 1, 0.8, 0.4, 0.733333333333333, NA, 0.933333333333333, NA, NA, 0.6, 0.533333333333333, NA, 0.533333333333333, NA, 0, 0.6, 0.533333333333333, 0.733333333333333, 0.6, 0.733333333333333, 0.666666666666667, 0.533333333333333, 0.733333333333333, 0.466666666666667, 0.733333333333333, 1, 0.733333333333333, 0.666666666666667, 0.533333333333333, NA, 0.533333333333333, 0.6, 0.866666666666667, 0.466666666666667, 0.533333333333333, 0.333333333333333, 0.6, 0.6, 0.866666666666667, 0.666666666666667, 0.6, 0.6, 0.533333333333333)), .Names = c("id", "one", "two", "three", "four", "five", "six", "seven", "eight", "nine", "ten", "elleven", "twelve"), class = "data.frame", row.names = c(NA, -46L)) iqr <- function(x, ...) { qs <- quantile(as.numeric(x), c(0.25, 0.5, 0.75), na.rm = T) names(qs) <- c("ymin", "y", "ymax") qs } magic <- function(y, ...) { high <- median(SOdata[[y]], na.rm=T)+1.5*sd(SOdata[[y]],na.rm=T) low <- median(SOdata[[y]], na.rm=T)-1.5*sd(SOdata[[y]],na.rm=T) ggplot(SOdata, aes_string(x="six", y=y))+ stat_summary(fun.data="iqr", geom="crossbar", fill="grey", alpha=0.3)+ geom_point(data = SOdata[SOdata[[y]] > high,], position=position_jitter(w=0.1, h=0),col="green", alpha=0.5)+ geom_point(data = SOdata[SOdata[[y]] < low,], position=position_jitter(w=0.1, h=0),col="red", alpha=0.5)+ stat_summary(fun.y=median, geom="point",shape=18 ,size=4, col="orange") } for (i in names(SOdata)[-c(1,7)]) { p<- magic(i) ggsave(paste("magig_plot_",i,".png",sep=""), plot=p, height=3.5, width=5.5) }

    Read the article

  • The dynamic Type in C# Simplifies COM Member Access from Visual FoxPro

    - by Rick Strahl
    I’ve written quite a bit about Visual FoxPro interoperating with .NET in the past both for ASP.NET interacting with Visual FoxPro COM objects as well as Visual FoxPro calling into .NET code via COM Interop. COM Interop with Visual FoxPro has a number of problems but one of them at least got a lot easier with the introduction of dynamic type support in .NET. One of the biggest problems with COM interop has been that it’s been really difficult to pass dynamic objects from FoxPro to .NET and get them properly typed. The only way that any strong typing can occur in .NET for FoxPro components is via COM type library exports of Visual FoxPro components. Due to limitations in Visual FoxPro’s type library support as well as the dynamic nature of the Visual FoxPro language where few things are or can be described in the form of a COM type library, a lot of useful interaction between FoxPro and .NET required the use of messy Reflection code in .NET. Reflection is .NET’s base interface to runtime type discovery and dynamic execution of code without requiring strong typing. In FoxPro terms it’s similar to EVALUATE() functionality albeit with a much more complex API and corresponiding syntax. The Reflection APIs are fairly powerful, but they are rather awkward to use and require a lot of code. Even with the creation of wrapper utility classes for common EVAL() style Reflection functionality dynamically access COM objects passed to .NET often is pretty tedious and ugly. Let’s look at a simple example. In the following code I use some FoxPro code to dynamically create an object in code and then pass this object to .NET. An alternative to this might also be to create a new object on the fly by using SCATTER NAME on a database record. How the object is created is inconsequential, other than the fact that it’s not defined as a COM object – it’s a pure FoxPro object that is passed to .NET. Here’s the code: *** Create .NET COM InstanceloNet = CREATEOBJECT('DotNetCom.DotNetComPublisher') *** Create a Customer Object Instance (factory method) loCustomer = GetCustomer() loCustomer.Name = "Rick Strahl" loCustomer.Company = "West Wind Technologies" loCustomer.creditLimit = 9999999999.99 loCustomer.Address.StreetAddress = "32 Kaiea Place" loCustomer.Address.Phone = "808 579-8342" loCustomer.Address.Email = "[email protected]" *** Pass Fox Object and echo back values ? 
loNet.PassRecordObject(loObject) RETURN FUNCTION GetCustomer LOCAL loCustomer, loAddress loCustomer = CREATEOBJECT("EMPTY") ADDPROPERTY(loCustomer,"Name","") ADDPROPERTY(loCustomer,"Company","") ADDPROPERTY(loCUstomer,"CreditLimit",0.00) ADDPROPERTY(loCustomer,"Entered",DATETIME()) loAddress = CREATEOBJECT("Empty") ADDPROPERTY(loAddress,"StreetAddress","") ADDPROPERTY(loAddress,"Phone","") ADDPROPERTY(loAddress,"Email","") ADDPROPERTY(loCustomer,"Address",loAddress) RETURN loCustomer ENDFUNC Now prior to .NET 4.0 you’d have to access this object passed to .NET via Reflection and the method code to do this would looks something like this in the .NET component: public string PassRecordObject(object FoxObject) { // *** using raw Reflection string Company = (string) FoxObject.GetType().InvokeMember( "Company", BindingFlags.GetProperty,null, FoxObject,null); // using the easier ComUtils wrappers string Name = (string) ComUtils.GetProperty(FoxObject,"Name"); // Getting Address object – then getting child properties object Address = ComUtils.GetProperty(FoxObject,"Address");    string Street = (string) ComUtils.GetProperty(FoxObject,"StreetAddress"); // using ComUtils 'Ex' functions you can use . Syntax     string StreetAddress = (string) ComUtils.GetPropertyEx(FoxObject,"AddressStreetAddress"); return Name + Environment.NewLine + Company + Environment.NewLine + StreetAddress + Environment.NewLine + " FOX"; } Note that the FoxObject is passed in as type object which has no specific type. Since the object doesn’t exist in .NET as a type signature the object is passed without any specific type information as plain non-descript object. To retrieve a property the Reflection APIs like Type.InvokeMember or Type.GetProperty().GetValue() etc. need to be used. I made this code a little simpler by using the Reflection Wrappers I mentioned earlier but even with those ComUtils calls the code is pretty ugly requiring passing the objects for each call and casting each element. Using .NET 4.0 Dynamic Typing makes this Code a lot cleaner Enter .NET 4.0 and the dynamic type. Replacing the input parameter to the .NET method from type object to dynamic makes the code to access the FoxPro component inside of .NET much more natural: public string PassRecordObjectDynamic(dynamic FoxObject) { // *** using raw Reflection string Company = FoxObject.Company; // *** using the easier ComUtils class string Name = FoxObject.Name; // *** using ComUtils 'ex' functions to use . Syntax string Address = FoxObject.Address.StreetAddress; return Name + Environment.NewLine + Company + Environment.NewLine + Address + Environment.NewLine + " FOX"; } As you can see the parameter is of type dynamic which as the name implies performs Reflection lookups and evaluation on the fly so all the Reflection code in the last example goes away. The code can use regular object ‘.’ syntax to reference each of the members of the object. You can access properties and call methods this way using natural object language. Also note that all the type casts that were required in the Reflection code go away – dynamic types like var can infer the type to cast to based on the target assignment. As long as the type can be inferred by the compiler at compile time (ie. the left side of the expression is strongly typed) no explicit casts are required. Note that although you get to use plain object syntax in the code above you don’t get Intellisense in Visual Studio because the type is dynamic and thus has no hard type definition in .NET . 
The above example calls a .NET Component from VFP, but it also works the other way around. Another frequent scenario is an .NET code calling into a FoxPro COM object that returns a dynamic result. Assume you have a FoxPro COM object returns a FoxPro Cursor Record as an object: DEFINE CLASS FoxData AS SESSION OlePublic cAppStartPath = "" FUNCTION INIT THIS.cAppStartPath = ADDBS( JustPath(Application.ServerName) ) SET PATH TO ( THIS.cAppStartpath ) ENDFUNC FUNCTION GetRecord(lnPk) LOCAL loCustomer SELECT * FROM tt_Cust WHERE pk = lnPk ; INTO CURSOR TCustomer IF _TALLY < 1 RETURN NULL ENDIF SCATTER NAME loCustomer MEMO RETURN loCustomer ENDFUNC ENDDEFINE If you call this from a .NET application you can now retrieve this data via COM Interop and cast the result as dynamic to simplify the data access of the dynamic FoxPro type that was created on the fly: int pk = 0; int.TryParse(Request.QueryString["id"],out pk); // Create Fox COM Object with Com Callable Wrapper FoxData foxData = new FoxData(); dynamic foxRecord = foxData.GetRecord(pk); string company = foxRecord.Company; DateTime entered = foxRecord.Entered; This code looks simple and natural as it should be – heck you could write code like this in days long gone by in scripting languages like ASP classic for example. Compared to the Reflection code that previously was necessary to run similar code this is much easier to write, understand and maintain. For COM interop and Visual FoxPro operation dynamic type support in .NET 4.0 is a huge improvement and certainly makes it much easier to deal with FoxPro code that calls into .NET. Regardless of whether you’re using COM for calling Visual FoxPro objects from .NET (ASP.NET calling a COM component and getting a dynamic result returned) or whether FoxPro code is calling into a .NET COM component from a FoxPro desktop application. At one point or another FoxPro likely ends up passing complex dynamic data to .NET and for this the dynamic typing makes coding much cleaner and more readable without having to create custom Reflection wrappers. As a bonus the dynamic runtime that underlies the dynamic type is fairly efficient in terms of making Reflection calls especially if members are repeatedly accessed. © Rick Strahl, West Wind Technologies, 2005-2010Posted in COM  FoxPro  .NET  CSharp  

    Read the article

  • Monitoring the Application alongside SQL Server

    - by Tony Davis
    Sometimes, on Simple-Talk, it takes a while to spot strange and unexpected patterns of user activity, or small bugs. For example, one morning we spotted that an article’s comment count had leapt to 1485, but that only four were displayed. With some rooting around in Google Analytics, and the endlessly annoying Community Server admin-interface, we were able to work out that a few days previously the article had been subject to a spam attack and that the comment count was for some reason including both accepted and unaccepted comments (which in turn uncovered a bug in the SQL). This sort of incident made us a lot keener on monitoring Simple-talk website usage more effectively. However, the metrics we wanted are troublesome, because they are far too specific for Google Analytics to measure, and the SQL Server backend doesn’t keep sufficient information to enable us to plot trends. The latter could provide, for example, the total number of comments made on, or votes cast for, articles, over all time, but not the number that occur by hour over a set time. We lacked a baseline, in other words. We couldn’t alter the database, as it is a bought-in package. We had neither the resources nor inclination to build-in dedicated application monitoring. Possibly, we could investigate a third-party tool to do the job; but then it occurred to us that we were already using a monitoring tool (SQL Monitor) to keep an eye on the database. It stored data, made graphs and sent alerts. Could we get it to monitor some aspects of the application as well? Of course, SQL Monitor’s single purpose is to check and monitor SQL Server, over time, rather than to monitor applications that use SQL Server. However, how different is the business of gathering and plotting SQL Server Wait Stats, from gathering and plotting various aspects of user activity on the site? Not a lot, it turns out. The latest version allows us to write our own custom monitoring scripts, meaning that we could now monitor any metric in the application that returns an integer. It took little time to write a simple SQL Query that collects basic metrics of the total number of subscribers, votes cast, comments made, or views of articles, over time. The SQL Monitor database polls Simple-Talk every second or so in order to get the latest totals, and can then store and plot this information, or even correlate SQL Server usage to application usage. You can see the live data by visiting monitor.red-gate.com. Click the "Analysis" tab, and select one of the "Simple-talk:" entries in the "Show" box and an appropriate data range (e.g. last 30 days). It’s nascent, and we’re still working on it, but it’s already given us more confidence that we’ll spot quickly trends, bugs, or bursts of ‘abnormal’ activity. If there is a sudden rise in comments, we get an alert, and if it’s due to a spam attack, we can moderate or ban the perpetrator very quickly. We’ve often argued that a tool should perform a single job well rather than turn into a Swiss-army knife, but ironically we’ve rather appreciated being able to make best use of what’s there anyway for a slightly different purpose. Is this a good or common practice? What do you think? Cheers, Tony.

    Read the article

  • Book Review (Book 12) - 20 Master Plots

    - by BuckWoody
    This is a continuation of the books I challenged myself to read to help my career - one a month, for a year. You can read my first book review here, and the entire list is here. The book I chose for May 2012 was:20 Master Plots by Ronald B. Tobias. This is my final book review - at least for this year. I'll explain what I've learned in this book in particular, and in the last twelve months in general. Why I chose this book: Stories and themes are part of software, presenting, and working in teams. This book claims there are only 20 plots, ever. I wanted to find out. What I learned: Probably my most favorite read of the year. Deceptively small, amazingly insightful. The premise is that there are only a few "base" themes, and that once you learn them you can put together an interesting set of stories on most any topic. Yes, the author admits that this number has been different throughout history - some have said 50, others 14, and still others claim only one or two basic plots. This doesn't change the fact that you can build very complex stories from a simple set of circumstances and characters. Be warned - if you read this book it takes away much of the wonder from almost every movie or book you'll read from here on! I loved it. My favorite part is that the author gives you exercises to build stories, right from the start. I've actually used these as the start of a meeting to foster creativity. Amazing stuff. One of my favorite sections of the book deals with plot and story. Plot: The king died, and the queen died. Story: The king died, and the queen died of heartbreak. Add one or two words, and you have the essence of storytelling. A highly recommended read, for all folks of all ages. You'll like it, your spouse will like it, and your kids will like it. I learned to be a better storyteller, and it helped me understand that plots and stories are not just things in books - they are a direct reflection of human nature. That makes me a better manager of myself and others.   And this is the last of the reviews - at least for this year. I probably won't post many more book reviews here, but I will keep up the practice. As a reminder, the goal was to select 12 books that will help you reach your career goals. They don't have to be technical, or even apply directly to your job - but they do need to be books that you mindfully select as getting you closer to what you want to be. Each month, jot down what you learned from the work. And see if it doesn't in fact get you closer to your goals. These readings helped me - I got a promotion this year, and I attribute at least some of that to the things I learned.

    Read the article

  • How would I down-sample a .wav file then reconstruct it using Nyquist? - in MATLAB [closed]

    - by martin
    Possible Duplicate: How would I down-sample a .wav file then reconstruct it using Nyquist? - in MATLAB

    This is all done in MATLAB 2010. My objective is to show the results of: undersampling, the Nyquist rate, and oversampling. First I need to down-sample the .wav file to get an incomplete/partial data stream that I can then reconstruct. Here's the flow chart of what I'm going to be doing. So the flow is: analog signal - sampling analog filter - ADC - resample down - resample up - DAC - reconstruction analog filter.

    What needs to be achieved: F = frequency, F (Hz = 1/s), e.g. 100 Hz = 100 cycles/sec. F(s) = 1/(2f). Example problem: 1000 Hz = highest frequency, so 1/(2*1000 Hz) = 1/2000 = 5x10^(-4) sec/cycle, or a sampling period of 0.5 ms.

    This is my first signal processing project using MATLAB. What I have so far:

    % Fs = frequency sampled (44100 Hz, the sampling frequency of a CD)
    [test,fs]=wavread('test.wav'); % loads the .wav file
    left=test(:,1);

    % Plot of the .wav signal, time vs. strength
    time=(1/44100)*length(left);
    t=linspace(0,time,length(left));
    plot(t,left)
    xlabel('time (sec)');
    ylabel('relative signal strength')

    % this is where I would need to sample it at the different frequencies
    % (above, below, and at the Nyquist frequency), I think

    soundsc(left,fs) % plays the resultant audio file, which is the same as the original
                     % (only at or above the Nyquist frequency, however)

    Can anyone tell me how to make it better, and how to do the various sampling at different frequencies?

    Read the article

  • iPhone MKMapView

    - by karthik
    Hi, this is Karthik. I am developing an app using a map view on the iPhone. I have to use two different pins to show user locations. The two users are of different types: one user belongs to one category and the other to a different category. I have to check that category and plot the pin depending on it. When I try to do that, the pin is not shown properly. For a certain category it shows a different pin, not the one I set. Can anyone help me? What I need is to load two different pins on the map depending on the category. For example, if the category is a shopping mall then I have to show a building image, and if it is a garden I have to show some other image, at the same time, not one by one. I want to plot two pins with different categories at the same time. What could I do for that? I tried a lot of solutions, but it is not showing properly. Help me?

    Read the article

  • iPhone: variable type returned by yajl

    - by Luc
    Hello, I'm quite new to iPhone programming and I want to do the following: get data from a JSON REST web server, parse the received data using YAJL, and draw a graph from those data using core-plot. So, the 1st item is fine: I use ASIHttpRequest, which runs as expected. The 3rd is almost fine (I still have to learn how to tune core-plot). The problem I have concerns the 2nd item. I use YAJL as it seems to be the fastest parser, so why not give it a try :) Here is the part of the code that gets the data from the server and parses it:

    // Get server data
    response_data = [request responseData];
    // Parse JSON received
    self.arrayFromData = [response_data yajl_JSON];
    NSLog(@"Array from data: %@", self.arrayFromData);

    The parsing works quite well in fact; the NSLog output is something like:

    2010-06-14 17:56:35.375 TEST_APP[3733:207] Array from data : { data = ( { val = 1317; date = "2010-06-10T15:50:01+02:00"; }, { val = 1573; date = "2010-06-10T16:20:01+02:00"; }, ........ { val = 840; date = "2010-06-11T14:50:01+02:00"; }, { val = 1265; date = "2010-06-11T15:20:01+02:00"; } ); from = "2010-06-10T15:50:01+02:00"; to = "2010-06-11T15:20:01+02:00"; max = "2590"; }

    According to the yajl-objc explanations (http://github.com/gabriel/yajl-objc), the parsing returns an NSArray... The thing is, I do not know how to get all the values from it, as to me it looks more like an NSDictionary than an NSArray... Could you please help? Thanks a lot, Luc

    Edit 1: it turns out that this object is actually an NSCFDictionary (!). I am still not able to get values from it; when I try the objectFromKey method (which should work on a dictionary, no?) it fails.

    Read the article

  • Google Maps API - Get points along route between lat/long

    - by user311374
    I have a web site that I am trying to get completed and I need to have the user click points on a map and then work out the route on the roads between the two points. So the user clicks the first point on 1st street, and then clicks another point on 4th street, and the map will find the best way to get there and plot the route on the map. I am assuming this can be done using directions and parse it up, but I have been searching for an hour now and can't find what I am looking for (maybe bad search terms). I need to be able to plot the map manually (?) so I can calculate the distance, etc... of the route as the user continues to click. The site that is in beta is http://www.RunMyRoute.com/UserRoutes/Create and you can see I am trying to create running routes. I want the user to have the option for the route to follow the roads versus just a straight line between two points on the map. Any help on this would be great! Simon.
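    As an illustration of the distance half of the problem only (not from the question itself): once the directions call has returned the route's lat/long points, the running length can be summed with the haversine formula. A sketch in Python, where the coordinates are made-up sample points:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(p1, p2):
        """Great-circle distance in km between two (lat, lng) pairs."""
        lat1, lng1, lat2, lng2 = map(radians, (*p1, *p2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
        return 2 * 6371.0 * asin(sqrt(a))

    route = [(40.7128, -74.0060), (40.7145, -74.0021), (40.7160, -73.9990)]  # sample points
    total = sum(haversine_km(a, b) for a, b in zip(route, route[1:]))
    print("route length: %.2f km" % total)

    The same arithmetic ports directly to JavaScript if the distance needs to be updated client-side as the user clicks.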

    Read the article

  • How to get levels for Fry Graph readability formula?

    - by Vic
    Hi, I'm working on an application (C#) that applies some readability formulas to a text, like Gunning-Fog, Precise SMOG, and Flesch-Kincaid. Now I need to implement the Fry-based Grade formula in my program. I understand the formula's logic: pretty much, you take three 100-word samples and calculate the average sentences per 100 words and syllables per 100 words, and then you use a graph to plot the values. Here is a more detailed explanation of how this formula works. I already have the averages, but I have no idea how I can tell my program to "go check the graph, plot the values and give me a level." I don't have to show the graph to the user; I only have to show him the level. I was thinking that maybe I can have all the values in memory, divided into levels, for example: Level 1: values whose sentence average is between 10.0 and 25+, and whose syllables average is between 108 and 132. Level 2: values whose sentence average is between 7.7 and 10.0, and .... so on. But the problem is that so far, the only place where I have found the values that define a level is the graph itself, and they aren't very accurate, so if I apply the approach described above, trying to take the values from the graph, my level estimations would be too imprecise, and thus the Fry-based Grade would not be accurate. So, maybe any of you knows of some place where I can find exact values for the different levels of the Fry-based Grade, or maybe you can help me think of a way to work around this. Thanks
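    As a sketch of the lookup-table idea above (in Python rather than the question's C#, and with placeholder boundaries only; the real vertices would have to be traced from the published Fry graph), each grade level can be stored as a polygon and the level found with a point-in-polygon test:

    from matplotlib.path import Path

    # grade level -> region polygon, vertices as (syllables/100 words, sentences/100 words)
    # NOTE: these vertices are placeholders, NOT the actual Fry graph boundaries
    fry_regions = {
        1: Path([(108, 25.0), (132, 25.0), (132, 10.0), (108, 10.0), (108, 25.0)]),
        2: Path([(108, 10.0), (132, 10.0), (132, 7.7), (108, 7.7), (108, 10.0)]),
        # ... one polygon per grade level, traced from the graph ...
    }

    def fry_level(avg_syllables, avg_sentences):
        for level, region in fry_regions.items():
            if region.contains_point((avg_syllables, avg_sentences)):
                return level
        return None  # the point falls in one of the graph's "invalid" grey areas

    print(fry_level(120.0, 8.5))  # -> 2 with the placeholder regions above

    The same table-of-polygons approach carries over to C# once the region vertices have been digitized.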

    Read the article

  • Saving "heavy" figure to PDF in MATLAB - rendering problem

    - by yuk
    I generate a figure in MATLAB with a large number of points (100,000+) and want to save it to a PDF file. With the zbuffer or painters renderer I get a very large file that opens slowly (over 4 MB) - all points are in vector format. Using the OpenGL renderer rasterizes the figure in the PDF, which is OK for the plot but not good for the text labels. The file size is then about 150 KB. Try this simplified code, for example:

    x=linspace(1,10,100000);
    y=sin(x)+randn(size(x));
    plot(x,y,'.')
    set(gcf,'Renderer','zbuffer')
    print -dpdf -r300 testpdf_zb
    set(gcf,'Renderer','painters')
    print -dpdf -r300 testpdf_pa
    set(gcf,'Renderer','opengl')
    print -dpdf -r300 testpdf_op

    The actual figure is much more complex, with several axes and different types of plots. Is there a way to rasterize the figure but keep the text labels as vectors? Another problem with OpenGL is that it does not work in terminal mode (-nosplash -nodesktop) under Mac OS X. It looks like OpenGL is not supported. I have to use terminal mode for automation. The MATLAB version I run is 2007b, on Mac OS X Server 10.4.
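    For comparison only (not a MATLAB answer): matplotlib exposes exactly this mixed mode through per-artist rasterization, where the dense point cloud becomes a bitmap inside the PDF while the axes and text stay as vectors. A minimal Python sketch:

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(1, 10, 100000)
    y = np.sin(x) + np.random.randn(x.size)

    fig, ax = plt.subplots()
    ax.plot(x, y, '.', rasterized=True)        # only this artist is rasterized
    ax.set_xlabel('x')
    ax.set_ylabel('sin(x) + noise')
    fig.savefig('testpdf_mixed.pdf', dpi=300)  # dpi sets the embedded bitmap resolution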

    Read the article

  • How to map coordinates in AxesImage to coordinates in saved image file?

    - by Vebjorn Ljosa
    I use matplotlib to display a matrix of numbers as an image, attach labels along the axes, and save the plot to a PNG file. For the purpose of creating an HTML image map, I need to know the pixel coordinates in the PNG file for a region in the image being displayed by imshow. I have found an example of how to do this with a regular plot, but when I try to do the same with imshow, the mapping is not correct. Here is my code, which saves an image and attempts to print the pixel coordinates of the center of each square on the diagonal: import numpy as np import matplotlib.pyplot as plt fig = plt.figure() ax = fig.add_axes([0.1, 0.1, 0.8, 0.8]) axim = ax.imshow(np.random.random((27,27)), interpolation='nearest') for x, y in axim.get_transform().transform(zip(range(28), range(28))): print int(x), int(fig.get_figheight() * fig.get_dpi() - y) plt.savefig('foo.png', dpi=fig.get_dpi()) Here is the resulting foo.png, shown as a screenshot in order to include the rulers: The output of the script starts and ends as follows: 73 55 92 69 111 83 130 97 149 112 … 509 382 528 396 547 410 566 424 585 439 As you see, the y-coordinates are correct, but the x-coordinates are stretched: they range from 73 to 585 instead of the expected 135 to 506, and they are spaced 19 pixels o.c. instead of the expected 14. What am I doing wrong?
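    A sketch of one commonly suggested variation (an assumption about the fix, not a verified answer): force a draw first so the layout is final, use the axes' transData, and take the figure height from the canvas rather than recomputing it from the dpi:

    import numpy as np
    import matplotlib.pyplot as plt

    fig = plt.figure()
    ax = fig.add_axes([0.1, 0.1, 0.8, 0.8])
    ax.imshow(np.random.random((27, 27)), interpolation='nearest')

    fig.canvas.draw()                            # finalize layout and transforms
    points = np.column_stack([range(27), range(27)])
    pixels = ax.transData.transform(points)      # data -> display (pixel) coords
    height = fig.canvas.get_width_height()[1]
    for px, py in pixels:
        print(int(px), int(height - py))         # flip y: PNG origin is top-left

    fig.savefig('foo.png', dpi=fig.get_dpi())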

    Read the article

  • Why do my CouchDB databases grow so fast?

    - by konrad
    I was wondering why my CouchDB database was growing so fast, so I wrote a little test script. This script changes an attribute of a CouchDB document 1200 times and takes the size of the database after each change. After performing these 1200 writing steps the database does a compaction step and the db size is measured again. In the end the script plots the database size against the revision numbers. The benchmark is run twice: the first time the default number of document revisions (=1000) is used (_revs_limit); the second time the number of document revisions is set to 1. The first run produces the following plot. The second run produces this plot. For me this is quite unexpected behavior. In the first run I would have expected linear growth, as every change produces a new revision. When the 1000 revisions are reached, the size value should be constant, as the older revisions are discarded. After the compaction the size should fall significantly. In the second run the first revision should result in a certain database size that is then kept during the following writing steps, as every new revision leads to the deletion of the previous one. I could understand if there were a little bit of overhead needed to manage the changes, but this growth behavior seems weird to me. Can anybody explain this phenomenon or correct the assumptions that led to my wrong expectations?

    Read the article

  • Calculate posterior distribution of unknown mis-classification with PRTools in MATLAB

    - by Samuel Lampa
    I'm using the PRTools MATLAB library to train some classifiers, generate test data and test the classifiers. I have the following details: N: total # of test examples; k: # of mis-classifications for each classifier and class. What I want to do: calculate and plot the Bayesian posterior distributions of the unknown probabilities of mis-classification (denoted q), that is, as probability density functions over q itself (so P(q) will be plotted over q, from 0 to 1). I have that (math formulae, not MATLAB code!): P(q|k,N) = Posterior * Prior / Normalization constant = P(k|q,N) * P(q|N) / P(k|N). The prior is set to 1, so I only need to calculate the posterior and the normalization constant. I know that the posterior can be expressed as (where B(N,k) is the binomial coefficient): P(k|q,N) = B(N,k) * q^k * (1-q)^(N-k) ... so the normalization constant is simply the integral of the posterior above, from 0 to 1: P(k|N) = B(N,k) * integralFromZeroToOne( q^k * (1-q)^(N-k) ). (The binomial coefficient B(N,k) appears in both the posterior and the normalization constant, so it can be omitted.) Now, I've heard that the integral for the normalization constant should be possible to calculate as a series ... something like: k!(N-k)! / (N+1)! Is that correct? (I have some lecture notes with this series in them, but can't figure out whether it is for the normalization constant integral or for the posterior distribution of mis-classification (q).) Also, hints are welcome on how to practically calculate this (factorials easily create truncation errors, right?) ... AND how to practically calculate the final plot (the posterior distribution over q, from 0 to 1).
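    For reference (the standard Beta-integral identity, not taken from the thread itself): the normalization integral above evaluates in closed form, so the normalized posterior is a Beta density:

    \int_0^1 q^k (1-q)^{N-k}\, dq \;=\; \frac{k!\,(N-k)!}{(N+1)!}
    \quad\Longrightarrow\quad
    P(q \mid k, N) \;=\; \frac{(N+1)!}{k!\,(N-k)!}\, q^k (1-q)^{N-k},

    i.e. a Beta(k+1, N-k+1) density; evaluating it through log-Gamma functions rather than raw factorials avoids the overflow/truncation issue the question worries about.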

    Read the article

  • How would I down-sample a .wav file then reconstruct it using Nyquist? - in MATLAB

    - by Andrew
    This is all done in MATLAB 2010. My objective is to show the results of: undersampling, the Nyquist rate, and oversampling. First I need to down-sample the .wav file to get an incomplete/partial data stream that I can then reconstruct. Here's the flow chart of what I'm going to be doing. So the flow is: analog signal - sampling analog filter - ADC - resample down - resample up - DAC - reconstruction analog filter.

    What needs to be achieved: F = frequency, F (Hz = 1/s), e.g. 100 Hz = 100 cycles/sec. F(s) = 1/(2f). Example problem: 1000 Hz = highest frequency, so 1/(2*1000 Hz) = 1/2000 = 5x10^(-4) sec/cycle, or a sampling period of 0.5 ms.

    This is my first signal processing project using MATLAB. What I have so far:

    % Fs = frequency sampled (44100 Hz, the sampling frequency of a CD)
    [test,fs]=wavread('test.wav'); % loads the .wav file
    left=test(:,1);

    % Plot of the .wav signal, time vs. strength
    time=(1/44100)*length(left);
    t=linspace(0,time,length(left));
    plot(t,left)
    xlabel('time (sec)');
    ylabel('relative signal strength')

    % this is where I would need to sample it at the different frequencies
    % (above, below, and at the Nyquist frequency), I think

    soundsc(left,fs) % plays the resultant audio file, which is the same as the original
                     % (only at or above the Nyquist frequency, however)

    Can anyone tell me how to make it better, and how to do the sampling at various frequencies?
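    A rough sketch of the down-sample / up-sample round trip in Python with SciPy, purely as an illustration of the steps in the flow chart (the thread itself is MATLAB, and the decimation factor here is an arbitrary example):

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import resample_poly

    fs, data = wavfile.read('test.wav')              # fs is the original rate, e.g. 44100
    left = data[:, 0] if data.ndim > 1 else data     # first channel

    factor = 4                                       # try factors above and below Nyquist
    down = resample_poly(left, up=1, down=factor)    # sample at fs/factor
    up = resample_poly(down, up=factor, down=1)      # reconstruct back at fs

    t = np.arange(left.size) / fs
    # compare `left` and `up[:left.size]` to see/hear the effect of the chosen rate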

    Read the article

  • Image processing on bifurcation diagram to get small eps size

    - by yCalleecharan
    Hello, I'm producing bifurcation diagrams (which are normally used in nonlinear dynamics). These diagrams identify abrupt changes in topology due to stability changes. These abrupt changes occur as one or more parameters pass through some critical value(s). An example is here: http://en.wikipedia.org/wiki/File:LogisticMap_BifurcationDiagram.png In the above figure, some image processing has been done to make the plot more visually pleasant. A bifurcation diagram usually contains hundreds of thousands of points, and the resulting EPS file can become very big. Journal submissions in LaTeX require figures to be submitted in EPS format. In my case one such figure can come to about 6 MB in MATLAB, and even much more in Gnuplot. For the example in the above figure, 100,000 x values are calculated for each r, and one can imagine that the resulting EPS file would be huge. The site however explains some image processing that makes the plot more visually pleasing. Can anyone explain to me, step by step, how to go about it? I can't understand the explanation provided in the "summary" section. Will the resulting image processing also reduce the figure size? Furthermore, any tips on reducing the file size of such a huge EPS figure? Thanks a lot...

    Read the article

  • Saving "heavy" image to PDF in MATLAB - rendering problem

    - by yuk
    I generate a figure in MATLAB with a large number of points (100,000+) and want to save it to a PDF file. With the zbuffer or painters renderer I get a very large file that opens slowly (over 4 MB) - all points are in vector format. Using the OpenGL renderer rasterizes the figure in the PDF, which is OK for the plot but not good for the text labels. The file size is then about 150 KB. Try this simplified code, for example:

    x=linspace(1,10,100000);
    y=sin(x)+randn(size(x));
    plot(x,y,'.')
    set(gcf,'Renderer','zbuffer')
    print -dpdf -r300 testpdf_zb
    set(gcf,'Renderer','painters')
    print -dpdf -r300 testpdf_pa
    set(gcf,'Renderer','opengl')
    print -dpdf -r300 testpdf_op

    The actual figure is much more complex, with several axes and different types of plots. Is there a way to rasterize the figure but keep the text labels as vectors? Another problem with OpenGL is that it does not work in terminal mode (-nosplash -nodesktop) under Mac OS X. It looks like OpenGL is not supported. I have to use terminal mode for automation. The MATLAB version I run is 2007b, on Mac OS X Server 10.4.

    Read the article

  • multi-core processing in R on windows XP - via doMC and foreach

    - by Jan
    Hi guys, I'm posting this question to ask for advice on how to optimize the use of multiple processors from R on a Windows XP machine. At the moment I'm creating 4 scripts (each script with e.g. for (i in 1:100), for (i in 101:200), etc.) which I run in 4 different R sessions at the same time. This seems to use all the available CPU, but I would like to do this a bit more efficiently. One solution could be to use the "doMC" and "foreach" packages, but that is not possible in R on a Windows machine. E.g.:

        library("foreach")
        library("strucchange")
        library("doMC")    # would this be possible on a windows machine?
        registerDoMC(2)    # for a computer with two cores (processors)

        ## Nile data with one breakpoint: the annual flows drop in 1898
        ## because the first Ashwan dam was built
        data("Nile")
        plot(Nile)

        ## F statistics indicate one breakpoint
        fs.nile <- Fstats(Nile ~ 1)
        plot(fs.nile)
        breakpoints(fs.nile)    # , hpc = "foreach" --> It would be great to test this.
        lines(breakpoints(fs.nile))

    Any solutions or advice? Thanks, Jan
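
    The underlying pattern - hand chunks of loop iterations to worker processes instead of running four sessions by hand - can be sketched in Python's multiprocessing, shown here only to illustrate the idea; the placeholder fit_one function and the worker count are assumptions, not part of the asker's R code:

        from multiprocessing import Pool

        def fit_one(i):
            # stand-in for one iteration of the asker's for-loop body
            return i * i

        if __name__ == "__main__":               # guard required on Windows
            with Pool(processes=2) as pool:      # one worker per core
                results = pool.map(fit_one, range(1, 201))
            print(len(results), "iterations completed across the workers")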

    Read the article

  • How to bind to another event after ajax call in jquery

    - by robert
    Hi, I'm creating a graph using the flot JavaScript library. I have enabled clickable events, and the div is bound to the plotclick event. So, when a data point is clicked, an ajax call is made and the result is used to replace the current div. After this I need to bind the div to a new event. I tried to call bind, but it is still bound to the old callback; when I call unbind and then bind, the new callback is not called.

        var handleTeacherClick = function( event, pos, item ) {
            if( typeof item != "undefined" && item ) {
                var user_id = jQuery( 'input[name="id' + item.datapoint[0] + '"]' ).val();
                jQuery.ajax({
                    type: 'POST',
                    url: BASEPATH + 'index.php/ajax/home/latest',
                    data: { "user_id": user_id },
                    dataType: 'json',
                    success: function ( result ) {
                        jQuery.plot(jQuery('#stats_prog'), result.progress_data, result.progress_options);
                        jQuery.plot(jQuery('#stats_perf'), result.performance_data, result.performance_options);
                        jQuery('.stats_title').html('<span class="stats_title">' +
                            ' >> Chapter ' + Math.ceil(item.datapoint[0]) + '</span>');
                        jQuery('#stats_prog')./*unbind("plotclick").*/ bind('plotclick', statClickHandler );
                        jQuery('#stats_perf')./*unbind("plotclick"). */ bind('plotclick', statClickHandler );
                    },
                });
            }
        }

    Read the article

  • Graphing the pitch (frequency) of a sound

    - by Coronatus
    I want to plot the pitch of a sound into a graph. Currently I can plot the amplitude. The graph below is created from the data returned by getUnscaledAmplitude():

        AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(
                new BufferedInputStream(new FileInputStream(file)));
        byte[] bytes = new byte[(int) (audioInputStream.getFrameLength())
                * (audioInputStream.getFormat().getFrameSize())];
        audioInputStream.read(bytes);

        // Get amplitude values for each audio channel in an array.
        graphData = type.getUnscaledAmplitude(bytes, this);

        public int[][] getUnscaledAmplitude(byte[] eightBitByteArray, AudioInfo audioInfo)
        {
            int[][] toReturn = new int[audioInfo.getNumberOfChannels()]
                    [eightBitByteArray.length / (2 * audioInfo.getNumberOfChannels())];
            int index = 0;

            for (int audioByte = 0; audioByte < eightBitByteArray.length;)
            {
                for (int channel = 0; channel < audioInfo.getNumberOfChannels(); channel++)
                {
                    // Do the byte to sample conversion.
                    int low = (int) eightBitByteArray[audioByte];
                    audioByte++;
                    int high = (int) eightBitByteArray[audioByte];
                    audioByte++;
                    int sample = (high << 8) + (low & 0x00ff);

                    if (sample < audioInfo.sampleMin) {
                        audioInfo.sampleMin = sample;
                    } else if (sample > audioInfo.sampleMax) {
                        audioInfo.sampleMax = sample;
                    }

                    toReturn[channel][index] = sample;
                }
                index++;
            }

            return toReturn;
        }

    But I need to show the audio's pitch, not its amplitude. A Fast Fourier transform appears to be the way to get the pitch, but it needs to know more variables than the raw bytes I have, and it is very complex and mathematical. Is there a way I can do this?
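
    The core of an FFT-based approach is small: window a block of samples, transform it, and read off the frequency of the strongest bin. A rough sketch in Python/NumPy rather than the asker's Java; note that picking the single largest bin finds the dominant partial, not necessarily the perceived musical pitch, and the 440 Hz test tone is an assumption made for the example:

        import numpy as np

        def dominant_frequency(samples, sample_rate):
            """Return the strongest frequency (Hz) in one window of samples."""
            windowed = samples * np.hanning(len(samples))      # reduce spectral leakage
            spectrum = np.abs(np.fft.rfft(windowed))
            freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
            return freqs[np.argmax(spectrum)]

        sr = 44100
        t = np.arange(0, 0.05, 1.0 / sr)
        tone = np.sin(2 * np.pi * 440 * t)                     # 440 Hz test tone
        print(dominant_frequency(tone, sr))                    # prints a value near 440

    Run per block of decoded samples, this gives one pitch estimate per time window, which can then be plotted against time just like the amplitude graph.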

    Read the article

  • Transform only one axis to log10 scale with ggplot2

    - by daroczig
    I have the following problem: I would like to visualize a discrete and a continuous variable on a boxplot where the latter has a few extremely high values. This makes the boxplot meaningless (the points and even the "body" of the chart are too small), which is why I would like to show it on a log10 scale. I am aware that I could leave out the extreme values from the visualization, but I do not intend to. Let's see a simple example with the diamonds data:

        m <- ggplot(diamonds, aes(y = price, x = color))

    The problem is not serious here, but I hope you can imagine why I would like to see the values on a log10 scale. Let's try it:

        m + geom_boxplot() + coord_trans(y = "log10")

    As you can see, the y axis is log10 scaled and looks fine, but there is a problem with the x axis, which makes the plot very strange. The problem does not occur with scale_y_log10, but that is not an option for me, as I cannot use a custom formatter this way. E.g.:

        m + geom_boxplot() + scale_y_log10()

    My question: does anyone know a solution to plot the boxplot with a log10 scale on the y axis whose labels can be freely formatted with a formatter function, like in this thread?

    Edit (to help answerers, based on answers and comments): what I am really after is one log10-transformed axis (y) with non-scientific labels. I would like to label it like dollars (formatter=dollar) or any custom format. If I try @hadley's suggestion I get the following warnings:

        > m + geom_boxplot() + scale_y_log10(formatter=dollar)
        Warning messages:
        1: In max(x) : no non-missing arguments to max; returning -Inf
        2: In max(x) : no non-missing arguments to max; returning -Inf
        3: In max(x) : no non-missing arguments to max; returning -Inf

    with the y axis labels unchanged.
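
    For what it is worth, the same combination - log10 on one axis plus a custom tick-label formatter - can be sketched in Python/matplotlib rather than ggplot2; the log-normal fake prices and the dollar format string are assumptions made for the example:

        import numpy as np
        import matplotlib.pyplot as plt
        from matplotlib.ticker import FuncFormatter

        prices = np.random.lognormal(mean=7, sigma=1, size=1000)   # fake skewed prices

        fig, ax = plt.subplots()
        ax.boxplot(prices)
        ax.set_yscale("log")                                        # log10 scale on y only
        ax.yaxis.set_major_formatter(FuncFormatter(lambda v, pos: "${:,.0f}".format(v)))
        fig.savefig("log_boxplot.png")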

    Read the article

  • [C#] Three System.Drawing methods manifest slow drawing or flickery: Solutions? or Other Options?

    - by Luke Mcneice
    Hi all, I am doing a little graphing via System.Drawing and I'm having a few problems. I'm holding data in a Queue and drawing (graphing) that data onto three picture boxes; the method fills the picture box and then scrolls the graph across. So as not to draw on top of the previous drawings (and gradually look messier), I found two solutions:

    1. Call plot.Clear(BACKGOUNDCOLOR) before the draw loop [block commented below], although this causes a visible flicker because of the time the drawing loop takes.

    2. Call plot.DrawLine(channelPen[5], j, 140, j, 0) just before each DrawLine [commented below], although this causes the drawing to start fine and then quickly slow to a crawl, as if a wait command had been placed before the draw call.

    Here is the code for reference:

        /*
        plotx.Clear(BACKGOUNDCOLOR)
        ploty.Clear(BACKGOUNDCOLOR)
        plotz.Clear(BACKGOUNDCOLOR)
        */
        for (int j = 1; j < 599; j++)
        {
            if (j > RealTimeBuffer.Count - 1) break;

            QueueEntity past = RealTimeBuffer.ElementAt(j - 1);
            QueueEntity current = RealTimeBuffer.ElementAt(j);

            if (j == 1)
            {
                //plotx.DrawLine(channelPen[5], 0, 140, 0, 0);
                //ploty.DrawLine(channelPen[5], 0, 140, 0, 0);
                //plotz.DrawLine(channelPen[5], 0, 140, 0, 0);
            }

            //plotx.DrawLine(channelPen[5], j, 140, j, 0);
            plotx.DrawLine(channelPen[0], j - 1, (((past.accdata.X - 0x7FFF) / 256) + 64),
                           j, (((current.accdata.X - 0x7FFF) / 256) + 64));

            //ploty.DrawLine(channelPen[5], j, 140, j, 0);
            ploty.DrawLine(channelPen[1], j - 1, (((past.accdata.Y - 0x7FFF) / 256) + 64),
                           j, (((current.accdata.Y - 0x7FFF) / 256) + 64));

            //plotz.DrawLine(markerPen, j, 140, j, 0);
            plotz.DrawLine(channelPen[2], j - 1, (((past.accdata.Z - 0x7FFF) / 256) + 94),
                           j, (((current.accdata.Z - 0x7FFF) / 256) + 94));
        }

    Are there any tricks to avoid these overheads? If not, would there be any other/better solutions?

    Read the article

< Previous Page | 15 16 17 18 19 20 21 22 23 24 25 26  | Next Page >