Search Results

Search found 4778 results on 192 pages for 'asha series'.

Page 49/192 | < Previous Page | 45 46 47 48 49 50 51 52 53 54 55 56  | Next Page >

  • Pass data in np.ndarray to Highcharts

    - by F.N.B
    I'm working with Python 2.7, Jinja2, Flask and Highcharts. I create two numpy arrays (x1 and x2, type numpy.ndarray) and pass them to Highcharts. My problem is that the rendered data has no commas between the values. This is my Jinja2 code: <script> $(function () { $('#container').highcharts({ series: [{ name: 'Tokyo', data: {{ x1 }} }, { name: 'London', data: {{ x2 }} }] }); }); And this is what the Chrome dev tools network tab shows was rendered: series: [{ name: 'Tokyo', data: [1 4 5 2 3] }, { name: 'London', data: [3 6 7 4 1] }] Do I need to convert the numpy arrays to Python lists before passing them to Highcharts, or is there a better way to do this? Thanks
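    One possible fix, sketched below (not from the original post; the route, template name and sample values are invented): convert each ndarray to a plain Python list and serialize it to JSON before handing it to the template, so the rendered page contains [1, 4, 5, 2, 3] rather than numpy's space-separated repr.

      # Minimal Flask sketch: pass JSON-serialized lists, not raw ndarrays, to Jinja2.
      import json

      import numpy as np
      from flask import Flask, render_template

      app = Flask(__name__)

      @app.route("/chart")
      def chart():
          x1 = np.array([1, 4, 5, 2, 3])      # sample data standing in for the real series
          x2 = np.array([3, 6, 7, 4, 1])
          # ndarray.tolist() yields native ints/floats that json.dumps can serialize
          return render_template(
              "chart.html",                   # hypothetical template holding the Highcharts setup
              x1=json.dumps(x1.tolist()),
              x2=json.dumps(x2.tolist()),
          )

    In the template, {{ x1 }} would be HTML-escaped by default, so use {{ x1|safe }} (or Flask's |tojson filter) where the data arrays go.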

    Read the article

  • Tools for displaying text, PowerPoint style, in Linux

    - by Will Mc
    I have a problem where I need a way to display a repeating series of "images" on a computer monitor. Specifically, given a series of text files, I'd like a way to display the contents of those files on a screen, much as a PowerPoint slideshow would. My current thought is to find some tool that takes in a text file of some format and outputs an image containing the text from the file. I'd then put the images in a directory and have some slideshow program cycle continuously through them. It's a very hacky solution, obviously. So, does anyone know of tools that would do such a thing? Or is there a better way to do this? I've looked into the library libgd2, but it doesn't seem to support text wrapping for images, which is something I'd need. Thanks!
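    As a rough sketch of the hacky approach described above (assuming the Pillow imaging library; the directory name, image size and wrap width are made up), each text file can be wrapped with textwrap and rendered onto a PNG, after which any image slideshow tool (feh, Eye of GNOME, etc.) can cycle through the output directory:

      # Render each .txt file in ./slides to a same-named .png with wrapped text.
      import textwrap
      from pathlib import Path

      from PIL import Image, ImageDraw, ImageFont

      WIDTH, HEIGHT, MARGIN = 1024, 768, 40

      def text_file_to_image(txt_path, out_dir):
          text = txt_path.read_text()
          # Wrap each line to ~90 characters so long lines don't run off the image
          wrapped = "\n".join(textwrap.fill(line, width=90) for line in text.splitlines())
          img = Image.new("RGB", (WIDTH, HEIGHT), "white")
          font = ImageFont.load_default()      # swap in ImageFont.truetype(...) for larger text
          ImageDraw.Draw(img).multiline_text((MARGIN, MARGIN), wrapped, fill="black", font=font)
          img.save(out_dir / (txt_path.stem + ".png"))

      for txt in Path("slides").glob("*.txt"):
          text_file_to_image(txt, Path("slides"))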

    Read the article

  • How to change the range of a chart in Excel using VBA?

    - by Pieter
    Hi guys, I'm using an Excel sheet to keep track of a particular time series (my daily weight, if you must know). I created a macro that inserts a row, automatically adds today's date and calculates a moving average based on my input. There is also a chart to visualize my progress. I have tried recording a macro that updates the time series in the graph, but without success. How can I create a macro or VBA script that, when executed, updates the range of the graph from A(x):Cy to A(x-1):Cy so that it includes today's measurement? Thanks!
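    The usual VBA answer is a call along the lines of Chart.SetSourceData Source:=Range("A1:C" & lastRow) inside the macro. Purely for illustration, here is the same Excel object-model call driven from Python through pywin32 (the file path, sheet layout and chart index are assumptions, not from the original question):

      # Sketch: re-point the first embedded chart at A1:C<last used row> via Excel COM.
      import win32com.client

      excel = win32com.client.Dispatch("Excel.Application")
      wb = excel.Workbooks.Open(r"C:\data\weight.xlsx")        # hypothetical workbook path
      ws = wb.Worksheets(1)

      last_row = ws.Cells(ws.Rows.Count, 1).End(-4162).Row     # -4162 == xlUp
      chart = ws.ChartObjects(1).Chart                         # first embedded chart on the sheet
      chart.SetSourceData(ws.Range("A1:C%d" % last_row))       # expand the plotted range

      wb.Save()
      excel.Quit()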

    Read the article

  • Spreadsheet::WriteExcel create chart

    - by yaohung
    Hi, I used csv2xls.pl to convert a text log into an .xls file, and then applied the chart-creation code below: my $chart3 = $workbook->add_chart( type => 'line', embedded => 1 ); # Configure the series. $chart3->add_series( categories => '=Sheet1!$B$2:$B$64', values => '=Sheet1!$C$2:$C$64', name => 'Test data series 1', ); # Add some labels. $chart3->set_title( name => 'Bridge Rate Analysis' ); $chart3->set_x_axis( name => 'Packet Size' ); $chart3->set_y_axis( name => 'BVI Rate' ); # Insert the chart into the main worksheet. $worksheet->insert_chart( 'G2', $chart3 ); ========== I can see the chart in the .xls file; however, all the data is in text format, not numbers, so the chart looks wrong. Can you tell me how to convert the text into numbers before creating the chart? One other thing: any idea how to sort the .xls data before creating the chart? Thanks. Yaohung

    Read the article

  • How do I extract the last value of every output in C#?

    - by Prithvi
    Here are the contents of my text file: a name like "output1" followed by a series of numbers, then "output2" followed by a series of numbers, and so on, all separated by single spaces... I want to extract the last value of every output. How do I do this? I.e., I want to extract the value of "output1" as 29.933215 (the value just before "output2") and the value of "output2" as 379.983559. The content of the text file is shown below: output1 30.000000 29.999969 29.999907 29.999783 29.999535 29.999039 29.998047 29.996063 29.992133 29.988202 29.984273 29.980343 29.976414 29.972485 29.968556 29.964628 29.960700 29.956773 29.952846 29.948919 29.944992 29.941066 29.937140 29.933215 output2 380.000000 379.999990 379.999970 379.999930 379.999850 379.999691 379.999372 379.998734 379.997469 379.996204 379.994940 379.993675 379.992411 379.991146 379.989882 379.988617 379.987353 379.986088 379.984824 379.983559 output3 1.761964 1.761964 1.761964 1.761964 1.761964 1.761964 1.761964 1.761964 1.761963 1.761963 1.761963 1.761963 1.761963 1.761963 1.761963 1.761963 1.761962 1.761962 1.761962 1.761962 1.761961 1.761961 1.761961 1.761960
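    The parsing idea is simply "remember the current outputN label and keep overwriting its value, so the last number before the next label wins." Sketched below in Python purely to illustrate the loop (the asker wants C#, where the same logic maps onto String.Split and double.Parse):

      # Illustration of the approach only: map each outputN label to its last number.
      def last_values(text):
          result, current = {}, None
          for token in text.split():
              if token.startswith("output"):
                  current = token
                  result[current] = None
              elif current is not None:
                  result[current] = float(token)   # later numbers overwrite earlier ones
          return result

      sample = "output1 30.000000 29.999969 29.933215 output2 380.000000 379.983559"
      print(last_values(sample))   # {'output1': 29.933215, 'output2': 379.983559}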

    Read the article

  • Flex 1009 error Chart datatip

    - by JMcNally
    I am simply trying to hide a datatip on a chart when the hit data item (yField item) is == 101, for two different series: an AreaSeries and a LineSeries. The following function works in IE but throws an error in Firefox. I am new to Flex and was wondering if there is a better way to approach this. The line wrapped in // markers is the one causing the error. public function datatip_M_chart(e:HitData):String { //if (e.item.STUB == 101) {e.chartItem.itemRenderer.alpha = 0}// if (e.chartItem is AreaSeriesItem){ return'<b>'+COMBO.selectedLabel+'</b>'+ '<br/><i>Age: </i>' + e.item.STUB + '<br/><i>Percentage:</i> ' + e.item.M_PROP+'%'; }else{ return'<b>'+COMBO2.selectedLabel+'</b>'+ '<br/><i>Age: </i>' + e.item.STUB + '<br/><i>Percentage:</i> ' + e.item.M_PROP+'%'; }

    Read the article

  • BlackBerry Wi-Fi connection issue

    - by AmitG
    Hi all, I have a very strange issue with my BlackBerry application. My application first searches for a BlackBerry network transport ( ;interface=wifi , ;deviceside=true , ;deviceside=false" + ";ConnectionUID="+ myRecord.getUid() ) and, after finding one, uses it to connect to the server application. The issue is that some of the web services consumed by the BB client application do not work when the connection type is Wi-Fi. Example: I have a series of web services called one after the other: login, x, y, z, and logout. With the connection transport set to Wi-Fi, Login and LogOff work fine, but the x, y and z web services return an error message. The same series of web services (Login, x, y, z, LogOut) works with other connection transports (e.g. ap2trans). Please help me with this issue, as connecting over Wi-Fi (if available) is important for the application. Please provide even the smallest piece of information in this regard. Thanks in advance

    Read the article

  • Using ggplot in R

    - by g256
    I have a time series CSV file like this (the val columns hold the values): Hour,Date,Type,val1,val2,val3,val4,val5,val6..... 0100,0153,aaa, 0100,0153,bbb, 0100,0153,ccc, 0100,0153,ddd, 0200,0153,aaa, 0200,0153,bbb, 0200,0153,ccc, 0200,0153,ddd, . . 0100,0154,aaa, 0100,0154,bbb, 0100,0154,ccc, 0100,0154,ddd, . . I would like to use ggplot to plot bar charts of the means of the time series val1, val2, ... for each Type: one plot per Type, except that ccc and ddd should be treated as a single Type (so for those I should first sum the values, then take the mean, then plot). So three plots in the end. Thanks for any advice.

    Read the article

  • R software: How to extract values from a RasterStack with x,y coordinates?

    - by Eddie
    I have a RasterStack (5 raster layers) that is actually a time series raster. r <- raster(nrow=20, ncol=200) s <- stack( sapply(1:5, function(i) setValues(r, rnorm(ncell(r), i, 3) )) ) > s class : RasterStack dimensions : 20, 200, 4000, 5 (nrow, ncol, ncell, nlayers) resolution : 1.8, 9 (x, y) extent : -180, 180, -90, 90 (xmin, xmax, ymin, ymax) coord. ref. : +proj=longlat +datum=WGS84 +ellps=WGS84 +towgs84=0,0,0 names : layer.1, layer.2, layer.3, layer.4, layer.5 min values : -9.012146, -9.165947, -9.707269, -7.829763, -5.332007 max values : 11.32811, 11.97328, 15.99459, 15.66769, 16.72236 My objective is to plot each pixel and explore its behavior over time. How could I extract each pixel together with its x,y coordinates and plot a time series curve?

    Read the article

  • Count Records in Listing View

    - by 47
    I have these two models: class CommonVehicle(models.Model): year = models.ForeignKey(Year) series = models.ForeignKey(Series) engine = models.ForeignKey(Engine) body_style = models.ForeignKey(BodyStyle) ... class Vehicle(models.Model): objects = VehicleManager() stock_number = models.CharField(max_length=6, blank=False) vin = models.CharField(max_length=17, blank=False) common_vehicle = models.ForeignKey(CommonVehicle) .... What I want to do is get a count of how many times a given CommonVehicle object is used by the Vehicle class. So far my attempts give me a single number, which is the total of all the records. How can I get the count to be the number of appearances of each CommonVehicle?
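    One possible approach (a sketch, assuming the models above live in an installed Django app): let the database do the counting with an annotation on the reverse ForeignKey relation, rather than counting rows in Python.

      # Each CommonVehicle row comes back with the number of Vehicles referencing it.
      from django.db.models import Count

      common_vehicles = CommonVehicle.objects.annotate(
          vehicle_count=Count("vehicle")   # "vehicle" is the default reverse name for the FK
      )

      for cv in common_vehicles:
          print(cv.pk, cv.vehicle_count)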

    Read the article

  • Iterating over a large data set in long running Python process - memory issues?

    - by user1094786
    I am working on a long-running Python program (part of it is a Flask API, and the other part a realtime data fetcher). Both of my long-running processes iterate quite often (the API one might even do so hundreds of times a second) over large data sets (second-by-second observations of certain economic series, for example 1-5 MB worth of data or even more). They also interpolate, compare and do calculations between series, etc. What techniques can I practice, for the sake of keeping my processes alive, when iterating over / passing as parameters / processing these large data sets? For instance, should I use the gc module and collect manually? Any advice would be appreciated. Thanks!
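    One technique worth considering (a generic sketch, not tied to the asker's data layout; the file name and "timestamp,value" format are invented) is to stream observations through generators so a request never materializes the full series at once. Explicit gc.collect() calls are rarely the fix, since CPython frees unreferenced objects immediately via reference counting.

      # Stream a large "timestamp,value" series instead of loading it into one big list.
      def observations(path):
          with open(path) as fh:
              for line in fh:
                  ts, value = line.rstrip("\n").split(",", 1)
                  yield ts, float(value)

      def rolling_mean(path, window=60):
          buf, total = [], 0.0
          for _, value in observations(path):
              buf.append(value)
              total += value
              if len(buf) > window:
                  total -= buf.pop(0)      # keep only `window` points in memory
              yield total / len(buf)

      # Example: consume lazily; memory stays proportional to `window`, not the file size.
      for avg in rolling_mean("series.csv", window=300):
          pass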

    Read the article

  • How to get an item value from JSON using C#?

    - by user3487837
    How do I get an item value from JSON using C#? The JSON: [{ ID: '6512', fd: [{ titie: 'Graph-01', type: 'graph', views: { graph: { show: true, state: { group: 'DivisionName', series: ['FieldWeight', 'FactoryWeight', 'Variance'], graphType: 'lines-and-points' } } } }, { titie: 'Graph-02', type: 'Graph', views: { graph: { show: true, state: { group: 'DivisionName', series: ['FieldWeight', 'FactoryWeight', 'Variance'], graphType: 'lines-and-points' } } } }] }, { ID: '6506', fd: [{ titie: 'Map-01', type: 'map', views: { map: { show: true, state: { kpiField: 'P_BudgetAmount', kpiSlabs: [{ id: 'P_BudgetAmount', hues: ['#0fff03', '#eb0707'], scales: '10' }] } } } }] }] From the JSON above, I want to collect the titie values into a list. Please help me... My code is: string dashletsConfigPath = Url.Content("~/Content/Dashlets/Dashlets.json"); string jArray = System.IO.File.ReadAllText(Server.MapPath(dashletsConfigPath)); List<string> lists = new List<string>(); JArray list = JArray.Parse(jArray); var ll = list.Select(j => j["dashlets"]).ToList();

    Read the article

  • Add Recaptcha and GridView to an ASP.NET 3.5 Guestbook using MS SQL Server and VB.NET

    This is the conclusion to a four-part ASP.NET 3.5 guest book application tutorial series. In this last part you will learn how to integrate Recaptcha, which is used to prevent automated spam bot submissions. Also discussed is how to add a GridView web control to display all guest book comments retrieved from the database.

    Read the article

  • Using a service registry that doesn’t suck Part III: Service testing is part of SOA governance

    - by gsusx
    This is the third post in this series intended to highlight some of the principles of a modern SOA governance solution. You can read the first two parts here: Using a service registry that doesn’t suck part I: UDDI is dead Using a service registry that doesn’t suck part II: Dear registry, do you have to be a message broker? This time I’ve decided to focus on one of the aspects that drives me ABSOLUTELY INSANE about traditional SOA governance solutions: service testing, or should I say the lack of...(read more)

    Read the article

  • Using a service registry that doesn’t suck part II: Dear registry, do you have to be a message broker?

    - by gsusx
    Continuing our series of posts about service registry patterns that suck, we decided to address one of the most common techniques that service-oriented architecture (SOA) governance tools use to enforce policies. Scenario: service registries and repositories typically serve as a mechanism for storing service policies that model behaviors such as security, trust, reliable messaging, SLAs, etc. This makes perfect sense given that SOA governance registries were conceived as a mechanism to store and manage the policies...(read more)

    Read the article

  • Reverse-engineer SharePoint fields, content types and list instances—Part 2

    - by ybbest
    Reverse-engineer SharePoint fields, content types and list instances—Part 1 Reverse-engineer SharePoint fields, content types and list instances—Part 2 In part 1 of this series, I demonstrated how to use VS2010 to reverse-engineer SharePoint fields, content types and list instances. In part 2 of this series, I will demonstrate how to do the same using CKS:Dev. CKS:Dev extends the Visual Studio 2010 SharePoint project system with advanced templates and tools. Using these extensions you will be able to find relevant information from your SharePoint environments without leaving Visual Studio, you will have greater productivity while developing SharePoint components, and you will have greater deployment capabilities on your local SharePoint installation. You can download the complete solution here. 1. First, download and install the appropriate CKS:Dev release from CodePlex: if you are using SharePoint Foundation 2010, download and install the SharePoint Foundation 2010 version; if you are using SharePoint Server 2010, download and install the SharePoint Server 2010 version. 2. After installation, restart Visual Studio and create an empty SharePoint project. 3. Go to View → Server Explorer. 4. Add a SharePoint web application connection to the Server Explorer. 5. After adding the connection, you can browse the contents of the web application. 6. Go to Site Columns → YBBEST (a custom group of your own choice), right-click the YBBEST folder and click Import Site Columns. 7. Go to Content Types → YBBEST (a custom group of your own choice), right-click the YBBEST folder and click Import Content Types. 8. After the import completes, you can find the fields and content types in the SharePoint project below; of course you need to make some modifications to your current project to make it work. 9. Next, create list instances using the list instance item template in Visual Studio. 10. Finally, create the lookup columns using feature receivers, and the final project will look like this. You can download the complete solution here.

    Read the article

  • Database Trends & Applications column: Database Benchmarking from A to Z

    - by KKline
    Have you heard of the monthly print and web magazine Database Trends & Applications (DBTA)? Did you know I'm the regular columnist covering SQL Server? For the past six months, I've been writing a series of articles about database benchmarking, culminating in the latest article discussing my three favorite database benchmarking tools: the free, open-source HammerDB, the native SQL Server Distributed Replay Utility, and the commercial Benchmark Factory from Dell / Quest Software. Wondering what...(read more)

    Read the article

  • More information on the Patch Tuesday updates for SQL Server

    - by AaronBertrand
    Last week, Microsoft released a series of patches for all supported versions of SQL Server (from SQL Server 2005 SP3 all the way to SQL Server 2008 R2). The reason for the patch against SQL Server installations is largely a client-side issue with the XML viewer application, and for SQL Server specifically, the exploit is limited to potential information disclosure. A very easy way to avoid exposure to this exploit is simply to never open a file with the .disco extension (these files are likely already...(read more)

    Read the article

  • Oracle Data Mining a Star Schema: Telco Churn Case Study

    - by charlie.berger
    There is a complete and detailed Telco Churn case study "How to" blog series just posted by Ari Mozes, ODM Dev. Manager. In it, Ari provides detailed guidance on how to leverage various strengths of Oracle Data Mining, including the ability to:

    - mine star schemas, joining tables and views together to obtain a complete 360-degree view of a customer
    - combine transactional data, e.g. call detail record (CDR) data, etc.
    - define complex data transformation, model build and model deploy analytical methodologies inside the Database

    His blog is posted as a multi-part series. Below are some opening excerpts from the first 3 blog entries. This is an excellent resource for any novice to skilled data miner who wants to gain competitive advantage by mining their data inside the Oracle Database. Many thanks Ari!

    Mining a Star Schema: Telco Churn Case Study (1 of 3)

    One of the strengths of Oracle Data Mining is the ability to mine star schemas with minimal effort. Star schemas are commonly used in relational databases, and they often contain rich data with interesting patterns. While dimension tables may contain interesting demographics, fact tables will often contain user behavior, such as phone usage or purchase patterns. Both of these aspects - demographics and usage patterns - can provide insight into behavior. Churn is a critical problem in the telecommunications industry, and companies go to great lengths to reduce the churn of their customer base. One case study describes a telecommunications scenario involving understanding, and identification of, churn, where the underlying data is present in a star schema. That case study is a good example for demonstrating just how natural it is for Oracle Data Mining to analyze a star schema, so it will be used as the basis for this series of posts...

    Mining a Star Schema: Telco Churn Case Study (2 of 3)

    This post will follow the transformation steps as described in the case study, but will use Oracle SQL as the means for preparing data. Please see the previous post for background material, including links to the case study and to scripts that can be used to replicate the stages in these posts.

    1) Handling missing values for call data records. The CDR_T table records the number of phone minutes used by a customer per month and per call type (tariff). For example, the table may contain one record corresponding to the number of peak (call type) minutes in January for a specific customer, and another record associated with international calls in March for the same customer. This table is likely to be fairly dense (most type-month combinations for a given customer will be present) due to the coarse level of aggregation, but there may be some missing values. Missing entries may occur for a number of reasons: the customer made no calls of a particular type in a particular month, the customer switched providers during the timeframe, or perhaps there is a data entry problem. In the first situation, the correct interpretation of a missing entry would be to assume that the number of minutes for the type-month combination is zero. In the other situations, it is not appropriate to assume zero, but rather derive some representative value to replace the missing entries. The referenced case study takes the latter approach. The data is segmented by customer and call type, and within a given customer-call type combination, an average number of minutes is computed and used as a replacement value. In SQL, we need to generate additional rows for the missing entries and populate those rows with appropriate values. To generate the missing rows, Oracle's partition outer join feature is a perfect fit.

    select cust_id, cdre.tariff, cdre.month, mins
    from cdr_t cdr partition by (cust_id) right outer join
         (select distinct tariff, month from cdr_t) cdre
         on (cdr.month = cdre.month and cdr.tariff = cdre.tariff);

    ...

    Mining a Star Schema: Telco Churn Case Study (3 of 3)

    Now that the "difficult" work is complete - preparing the data - we can move to building a predictive model to help identify and understand churn. The case study suggests that separate models be built for different customer segments (high, medium, low, and very low value customer groups). To reduce the data to a single segment, a filter can be applied:

    create or replace view churn_data_high as
    select * from churn_prep where value_band = 'HIGH';

    It is simple to take a quick look at the predictive aspects of the data on a univariate basis. While this does not capture the more complex multivariate effects as would occur with the full-blown data mining algorithms, it can give a quick feel as to the predictive aspects of the data as well as validate the data preparation steps. Oracle Data Mining includes a predictive analytics package which enables quick analysis.

    begin
      dbms_predictive_analytics.explain(
        'churn_data_high', 'churn_m6', 'expl_churn_tab');
    end;
    /
    select * from expl_churn_tab where rank <= 5 order by rank;

    ATTRIBUTE_NAME       ATTRIBUTE_SUBNAME  EXPLANATORY_VALUE  RANK
    -------------------  -----------------  -----------------  ----
    LOS_BAND                                       .069167052     1
    MINS_PER_TARIFF_MON  PEAK-5                    .034881648     2
    REV_PER_MON          REV-5                     .034527798     3
    DROPPED_CALLS                                  .028110322     4
    MINS_PER_TARIFF_MON  PEAK-4                    .024698149     5

    From the above results, it is clear that some predictors do contain information to help identify churn (explanatory value > 0). The strongest univariate predictor of churn appears to be the customer's (binned) length of service. The second strongest churn indicator appears to be the number of peak minutes used in the most recent month. The subname column contains the interior piece of the DM_NESTED_NUMERICALS column described in the previous post. By using the object relational approach, many related predictors are included within a single top-level column. ...

    NOTE: These are just EXCERPTS. Click here to start reading the Oracle Data Mining a Star Schema: Telco Churn Case Study from the beginning.

    Read the article
