Search Results

Search found 9 results on 1 page for 'loess'.


  • science.js’s loess() output is identical to input

    - by user3710111
    Rendered project available here. The line is supposed to be a trend line (as rendered with LOESS), but it merely follows each data point instead. I am no stats wonk, so maybe it makes sense that a LOESS function's output would match the input as seen in the above example, but it strikes me as being wrong. Here is the relevant bit of code:

        var loess = science.stats.loess().bandwidth(.2);
        var xVal = data.map(function(d) { return d.date; });
        var yVal = data.map(function(d) { return d.A; });
        var loessData = loess([xVal], [yVal])[0];
        console.log(yVal);
        console.log(loessData);

    Read the article
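
    A very small smoothing bandwidth can make a LOESS fit track its input almost point for point, which would produce exactly this symptom. The sketch below is an R analogue (for illustration only, it does not use science.js; the data are made up) showing how the smoothing span controls how closely the fit follows the data:

        # illustrative R analogue: small span ~ tight fit, larger span ~ smoother fit
        set.seed(1)
        x <- 1:50
        y <- sin(x / 5) + rnorm(50, sd = 0.3)
        fit_tight <- loess(y ~ x, span = 0.2)   # roughly analogous to bandwidth(.2)
        fit_wide  <- loess(y ~ x, span = 0.75)  # the default span, visibly smoother
        plot(x, y)
        lines(x, predict(fit_tight), col = "red")
        lines(x, predict(fit_wide), col = "blue")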

  • Error using `loess.smooth` but not `loess` or `lowess`

    - by Sandy
    I need to smooth some simulated data, but occasionally run into problems when the simulated ordinates to be smoothed are mostly the same value. Here is a small reproducible example of the simplest case:

        > x <- 0:50
        > y <- rep(0,51)
        > loess.smooth(x,y)
        Error in simpleLoess(y, x, w, span, degree, FALSE, FALSE, normalize = FALSE, :
          NA/NaN/Inf in foreign function call (arg 1)

    loess(y~x), lowess(x,y), and their analogue in MATLAB produce the expected results without error on this example. I am using loess.smooth here because I need the estimates evaluated at a set number of points. According to the documentation, I believe loess.smooth and loess use the same estimation functions, but the former is an "auxiliary function" that handles the evaluation points. The error seems to come from a C function:

        > traceback()
        3: .C(R_loess_raw, as.double(pseudovalues), as.double(x), as.double(weights),
               as.double(weights), as.integer(D), as.integer(N), as.double(span),
               as.integer(degree), as.integer(nonparametric), as.integer(order.drop.sqr),
               as.integer(sum.drop.sqr), as.double(span * cell), as.character(surf.stat),
               temp = double(N), parameter = integer(7), a = integer(max.kd),
               xi = double(max.kd), vert = double(2 * D), vval = double((D + 1) * max.kd),
               diagonal = double(N), trL = double(1), delta1 = double(1),
               delta2 = double(1), as.integer(0L))
        2: simpleLoess(y, x, w, span, degree, FALSE, FALSE, normalize = FALSE,
               "none", "interpolate", control$cell, iterations, control$trace.hat)
        1: loess.smooth(x, y)

    loess also calls simpleLoess, but with what appear to be different arguments. Of course, if you make enough of the y values nonzero, loess.smooth runs without error, but I need the program to run even in the most extreme case. Hopefully, someone can help me with one and/or all of the following:

      1. Understand why only loess.smooth, and not the other functions, produces this error, and find a solution for the problem.
      2. Find a work-around using loess that still evaluates the estimate at a specified number of points that can differ from the vector x. For example, I might want to use only x <- seq(0,50,10) in the smoothing, but evaluate the estimate at x <- 0:50. As far as I know, using predict with a new data frame will not properly handle this situation, but please let me know if I am missing something there.
      3. Handle the error in a way that doesn't stop the program from moving on to the next simulated data set.

    Thanks in advance for any help on this problem.

    Read the article
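
    One hedged work-around, building on the fact that loess() itself runs on this example: fit with loess(), evaluate the fit on an arbitrary grid via predict(), and wrap the call in tryCatch() so a degenerate data set does not halt a simulation loop. The grid, span, and function name below are placeholders:

        x <- 0:50
        y <- rep(0, 51)                 # the degenerate case from the question
        x_eval <- seq(0, 50, by = 0.5)  # any grid inside the range of x

        smooth_safely <- function(x, y, x_eval, span = 0.75) {
          tryCatch(
            predict(loess(y ~ x, span = span), newdata = data.frame(x = x_eval)),
            error = function(e) rep(NA_real_, length(x_eval))  # keep the loop going
          )
        }

        y_hat <- smooth_safely(x, y, x_eval)

    Note that predict.loess() returns NA outside the range of the fitting x values unless the model is fitted with control = loess.control(surface = "direct"), so this evaluates within the fitted range only.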

  • R: in ggplot, restrict y to be > 0 in a LOESS plot

    - by Nate
    Here's my code:

        qplot(data=sites, x, y, main="Site 349")
        (p <- qplot(data = sites, x, y, xlab = "", ylab = ""))
        (p1 <- p + geom_smooth(method = "loess", span = 0.5, size = 1.5))
        p1 + theme_bw() + opts(title = "Site 349")

    Some of the LOESS lines and confidence intervals go below zero, but I would like to restrict the plot to zero and positive numbers (because negative values do not make sense here). How can I do this?

    Read the article
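
    One hedged approach: clip the displayed range with coord_cartesian(), which keeps the LOESS fit and its confidence ribbon computed from all of the data but draws the panel no lower than zero. The names below assume the question's sites data frame with columns x and y:

        p1 + theme_bw() +
          coord_cartesian(ylim = c(0, max(sites$y)))

        # setting limits on the scale instead, e.g. scale_y_continuous(limits = c(0, max(sites$y))),
        # drops rows outside the limits before the smoother is fitted, which changes the fit itself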

  • How can I superimpose modified loess lines on a ggplot2 qplot?

    - by briandk
    Background

    Right now, I'm creating a multiple-predictor linear model and generating diagnostic plots to assess regression assumptions. (It's for a multiple regression analysis stats class that I'm loving at the moment :-) ) My textbook (Cohen, Cohen, West, and Aiken 2003) recommends plotting each predictor against the residuals to make sure that:

      1. The residuals don't systematically covary with the predictor
      2. The residuals are homoscedastic with respect to each predictor in the model

    On point (2), my textbook has this to say: "Some statistical packages allow the analyst to plot lowess fit lines at the mean of the residuals (0-line), 1 standard deviation above the mean, and 1 standard deviation below the mean of the residuals.... In the present case {their example}, the two lines {mean + 1 sd and mean - 1 sd} remain roughly parallel to the lowess {0} line, consistent with the interpretation that the variance of the residuals does not change as a function of X." (p. 131)

    How can I modify loess lines?

    I know how to generate a scatterplot with a "0-line":

        # First, I'll make a simple linear model and get its diagnostic stats
        library(ggplot2)
        data(cars)
        mod <- fortify(lm(speed ~ dist, data = cars))
        attach(mod)
        str(mod)

        # Now I want to make sure the residuals are homoscedastic
        qplot(x = dist, y = .resid, data = mod) +
          geom_smooth(se = FALSE)  # "se = FALSE" removes the standard error bands

    But does anyone know how I can use ggplot2 and qplot to generate plots where the 0-line, "mean + 1 sd", AND "mean - 1 sd" lines would be superimposed? Is that a weird/complex question to be asking?

    Read the article
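
    One hedged way to get all three lines in ggplot2, assuming the mod data frame from the question: because a non-robust loess fit of y + c is just the loess fit of y shifted by c, the +/- 1 sd lines can be drawn by smoothing the residuals shifted up and down by their standard deviation:

        library(ggplot2)
        data(cars)
        mod <- fortify(lm(speed ~ dist, data = cars))
        resid_sd <- sd(mod$.resid)

        qplot(x = dist, y = .resid, data = mod) +
          geom_smooth(se = FALSE) +                                                   # 0-line
          geom_smooth(aes(y = .resid + resid_sd), se = FALSE, linetype = "dashed") +  # mean + 1 sd
          geom_smooth(aes(y = .resid - resid_sd), se = FALSE, linetype = "dashed")    # mean - 1 sd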

  • How do I increase the number of evaluation points in geom_smooth for ggplot2 in R

    - by Halpo
    I'm creating a plot and adding a basic loess smooth line to it:

        qplot(Age.GTS2004., X.d18O, data = deepsea, geom = c('point')) +
          geom_smooth(method = "loess", se = T, span = 0.01,
                      alpha = .5, fill = 'light blue', color = 'navy')

    The problem is that the line comes out really choppy. I need more evaluation points for the curve in certain areas. Is there a way to increase the number of evaluation points without having to reconstruct geom_smooth?

    Read the article
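
    A hedged pointer: stat_smooth(), which geom_smooth() uses, accepts an n argument, the number of points at which the smoother is evaluated; the default is fairly coarse (around 80 points), which can make a tight-span loess look jagged. Raising n should smooth the drawn curve without touching geom_smooth internals (data frame and column names as in the question):

        qplot(Age.GTS2004., X.d18O, data = deepsea, geom = c('point')) +
          geom_smooth(method = "loess", se = TRUE, span = 0.01, n = 500,
                      alpha = .5, fill = 'light blue', color = 'navy')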

  • how to save a fitted R model for later use

    - by ahala
    Sorry for this novice question: if I fit an lm() model or a loess() model and save it to a file or a database for later use by a third party with the predict() method, do I have to save the entire model object? Since the returned model object contains the original raw data, it can be huge.

    Read the article
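
    A hedged sketch of the usual pattern: serialise the fitted object with saveRDS() and reload it with readRDS(). An lm() fit can also be slimmed down by not storing its model frame, and predict() on new data still works; loess() fits, by contrast, generally need their training data in order to predict, so they are harder to shrink. The file name and data below are placeholders:

        fit <- lm(mpg ~ wt + hp, data = mtcars, model = FALSE, x = FALSE, y = FALSE)
        saveRDS(fit, "fit.rds")

        # later, possibly in another R session or by a third party:
        fit2 <- readRDS("fit.rds")
        predict(fit2, newdata = data.frame(wt = 3, hp = 110))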

  • In R, how do you get the best fitting equation to a set of data?

    - by Matherion
    I'm not sure whether R can do this (I assume it can, but maybe that's just because I tend to assume that R can do anything :-)). What I need is to find the best fitting equation to describe a dataset. For example, if you have these points:

        df = data.frame(x = c(1, 5, 10, 25, 50, 100),
                        y = c(100, 75, 50, 40, 30, 25))

    How do you get the best fitting equation? I know that you can get the best fitting curve with:

        plot(loess(df$y ~ df$x))

    But as I understood it you can't extract the equation from that; see Loess Fit and Resulting Equation. When I try to build it myself (note, I'm not a mathematician, so this is probably not the ideal approach :-)), I end up with something like:

        y.predicted = 12.71 + ( 95 / (( (1 + df$x) ^ .5 ) / 1.3))

    which kind of approximates it, but I can't help thinking that something more elegant probably exists :-) I have the feeling that fitting a linear or polynomial model also wouldn't work, because the formula seems different from what those models generally use (i.e. this one seems to need divisions, powers, etc.). For example, the approach in Fitting polynomial model to data in R gives pretty bad approximations. I remember from a long time ago that there exist languages (Matlab may be one of them?) that do this kind of stuff. Can R do this as well, or am I just in the wrong place?

    (Background info: basically, what we need to do is find an equation for determining the numbers in the second column based on the numbers in the first column, but we decide the numbers ourselves. We have an idea of how we want the curve to look, but we can adjust these numbers to an equation if we get a better fit. It's about the pricing for a product (a cheaper alternative to current expensive software for qualitative data analysis); the more 'project credits' you buy, the cheaper it should become. Rather than forcing people to buy a given number (e.g. 5 or 10 or 25), it would be nicer to have a formula so people can buy exactly what they need, but of course this requires a formula. We have an idea for some prices we think are OK, but now we need to translate this into an equation.)

    Read the article
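
    A hedged sketch: once a functional form is chosen, R can estimate its parameters. The form below, y = a + b / sqrt(x), mirrors the hand-built guess in the question and is linear in its coefficients, so lm() can fit it directly; forms that are nonlinear in their parameters (say y = a + b * x^(-c)) are usually fitted with nls() given reasonable starting values:

        df <- data.frame(x = c(1, 5, 10, 25, 50, 100),
                         y = c(100, 75, 50, 40, 30, 25))

        fit <- lm(y ~ I(1 / sqrt(x)), data = df)
        coef(fit)                                       # the a and b in y = a + b / sqrt(x)
        predict(fit, newdata = data.frame(x = c(2, 20, 200)))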

  • Smooth Error in qplot from ggplot2

    - by Jared
    I have some data that I am trying to plot faceted by its Type, with a smooth (loess, lm, whatever) superimposed. Generation code is below:

        testFrame <- data.frame(Time = sample(20:60, 50, replace = T),
                                Dollars = round(runif(50, 0, 6)),
                                Type = sample(c("First", "Second", "Third", "Fourth"),
                                              50, replace = T, prob = c(.33, .01, .33, .33)))

    I have no problem either making a faceted plot or plotting the smooth, but I cannot do both. The first three lines of code below work fine; the fourth is where I have trouble:

        qplot(Time, Dollars, data = testFrame, colour = Type)
        qplot(Time, Dollars, data = testFrame, colour = Type) + geom_smooth()
        qplot(Time, Dollars, data = testFrame) + facet_wrap(~Type)
        qplot(Time, Dollars, data = testFrame) + facet_wrap(~Type) + geom_smooth()

    It gives the following error:

        Error in `[<-.data.frame`(`*tmp*`, var, value = list(NA = NULL)) :
          missing values are not allowed in subscripted assignments of data frames

    What am I missing to overlay a smooth on a faceted plot? I could have sworn I had done this before, possibly even with the same data.

    Read the article
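
    A hedged diagnostic sketch: with prob = .01, the "Second" level can end up with very few (or zero) points, and a per-panel smoother cannot be fitted on such a facet, which is one plausible trigger for this error in older ggplot2 versions (updating ggplot2 may also help). Counting points per Type, and dropping near-empty levels before adding the smoother, is one way to test and work around that; the threshold below is arbitrary:

        table(testFrame$Type)

        enough <- names(which(table(testFrame$Type) >= 5))
        qplot(Time, Dollars, data = subset(testFrame, Type %in% enough)) +
          facet_wrap(~Type) +
          geom_smooth(method = "loess")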
