Search Results

Search found 115 results on 5 pages for 'histogram'.

Page 4/5 | < Previous Page | 1 2 3 4 5  | Next Page >

  • How To Remotely Copy Files Over SSH Without Entering Your Password

    - by YatriTrivedi
    SSH is a life-saver when you need to remotely manage a computer, but did you know you can also use it to upload and download files? Using SSH keys, you can skip entering passwords and use SSH in your scripts! This process works on Linux and Mac OS, provided they're properly configured for SSH access. If you're using Windows, you can use Cygwin to get Linux-like functionality, and with a little tweaking, SSH will run as well.
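
    The article's approach is key-based SSH with the stock tooling; for the scripting angle it mentions, here is a minimal Python sketch using the third-party paramiko library (an assumption on my part — the article does not mention paramiko; the host, user, and paths are placeholders):

        import paramiko

        # Authenticate with a private key instead of a password.
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect('example.com', username='user',
                       key_filename='/home/user/.ssh/id_rsa')

        # Upload and download over the encrypted channel.
        sftp = client.open_sftp()
        sftp.put('report.txt', '/remote/path/report.txt')
        sftp.get('/remote/path/photo.jpg', 'photo.jpg')
        sftp.close()
        client.close()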

    Read the article

  • OpenCV/EmguCV face recognition

    - by Meko
    Hi. I am trying to make an app that detects a face and recognizes it. I have face detection working, but I need some ideas for the recognition part. I am using a webcam for tracking, and the app can detect a face. I then copy only the face region into a new grayscale image and compare it, using EigenObjectRecognizer, against a list of images in a database. But it is not giving good results: sometimes it finds a wrong match, sometimes nothing. Which additional techniques should I implement before comparing photos? Histogram equalization, or normalizing the resolution of the faces?
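
    Both techniques the asker names are standard preprocessing for eigenface-style recognizers: equalize the histogram so lighting differences shrink, and resize every crop to one fixed resolution so the comparison is like-for-like. A minimal sketch of that step in Python/OpenCV (the question uses EmguCV/C#, so take this as the idea rather than the exact API; the 100x100 size is an arbitrary choice):

        import cv2

        def normalize_face(bgr_face):
            # Give the recognizer the same lighting and the same resolution
            # for every sample: grayscale -> equalize -> fixed-size crop.
            gray = cv2.cvtColor(bgr_face, cv2.COLOR_BGR2GRAY)
            gray = cv2.equalizeHist(gray)
            return cv2.resize(gray, (100, 100))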

    Read the article

  • Stata - Multiple rotated plots on graph (including distributions on sides of axes)

    - by meerak
    I would like to produce a single graph containing both: (1) a scatter plot (2) either histograms or kernel density functions of the Y and X variables to the left of the Y axis and below the X axis. I found a graph that does this in MATLAB -- I would just like to produce something similar in Stata: That graph was produced using the following MATLAB code: n = 1000; rho = .7; Z = mvnrnd([0 0], [1 rho; rho 1], n); U = normcdf(Z); X = [gaminv(U(:,1),2,1) tinv(U(:,2),5)]; [n1,ctr1] = hist(X(:,1),20); [n2,ctr2] = hist(X(:,2),20); subplot(2,2,2); plot(X(:,1),X(:,2),'.'); axis([0 12 -8 8]); h1 = gca; title('1000 Simulated Dependent t and Gamma Values'); xlabel('X1 ~ Gamma(2,1)'); ylabel('X2 ~ t(5)'); subplot(2,2,4); bar(ctr1,-n1,1); axis([0 12 -max(n1)*1.1 0]); axis('off'); h2 = gca; subplot(2,2,1); barh(ctr2,-n2,1); axis([-max(n2)*1.1 0 -8 8]); axis('off'); h3 = gca; set(h1,'Position',[0.35 0.35 0.55 0.55]); set(h2,'Position',[.35 .1 .55 .15]); set(h3,'Position',[.1 .35 .15 .55]); colormap([.8 .8 1]); UPDATE: The Stata13 manual entry for "graph combine" has precisely this example (http://www.stata.com/manuals13/g-2graphcombine.pdf). Here is the code: use http://www.stata-press.com/data/r13/lifeexp, clear generate loggnp = log10(gnppc) label var loggnp "Log base 10 of GNP per capita" scatter lexp loggnp, ysca(alt) xsca(alt) xlabel(, grid gmax) fysize(25) saving(yx) twoway histogram lexp, fraction xsca(alt reverse) horiz fxsize(25) saving(hy) twoway histogram loggnp, fraction ysca(alt reverse) ylabel(,nogrid) xlabel(,grid gmax) saving(hx) graph combine hy.gph yx.gph hx.gph, hole(3) imargin(0 0 0 0) graphregion(margin(l=22 r=22)) title("Life expectancy at birth vs. GNP per capita") note("Source: 1998 data from The World Bank Group")

    Read the article

  • Why does my application ask for a codec to play MVI (.MOV) video files while I can play them in WMP and QuickTime?

    - by Daniel Lip
    I have an application I wrote some time ago. Loading the video file works, but when I try to play/use the file I get a MessageBox saying that a codec is needed, and to use GSpot or search the internet. When I play these files from my hard disk with Windows Media Player or QuickTime there are no problems. The video files are named like MVI_2483; in the file name properties I see the type: QuickTime Movie (.MOV). In my application I am using DirectShowLib-2005.dll; below is the class I use to extract frames (I use it in my application to extract only lightning frames from the video). In Form1 I have a button click event that starts the action: private void button8_Click(object sender, EventArgs e) { viewToolStripMenuItem.Enabled = false; fileToolStripMenuItem.Enabled = false; button2.Enabled = false; label14.Visible = false; label15.Visible = false; label21.Visible = false; label22.Visible = false; label24.Visible = false; label25.Visible = false; ExtractAutomatic = true; DirectoryInfo info = new DirectoryInfo(_videoFile); string dirName = info.Name; automaticModeDirectory = dirName + "_Automatic"; subDirectoryName = _outputDir + "\\" + automaticModeDirectory; if (secondPass == true) { Start(true); } Start(false); } This is the Start function in Form1: private void Start(bool secondpass) { setpicture(-1); if (Directory.Exists(_outputDir) && secondpass == false) { } else { Directory.CreateDirectory(_outputDir); } if (ExtractAutomatic == true) { string subDirectory_Automatic_Name = _outputDir + "\\" + automaticModeDirectory; Directory.CreateDirectory(subDirectory_Automatic_Name); f = new WmvAdapter(_videoFile, Path.Combine(subDirectory_Automatic_Name)); } else { string subDirectory_Manual_Name; if (Directory.Exists(subDirectoryName)) { subDirectory_Manual_Name = subDirectoryName; f = new WmvAdapter(_videoFile, Path.Combine(subDirectory_Manual_Name)); } else { subDirectory_Manual_Name = _outputDir + "\\" + averagesListTextFileDirectory + "_Manual"; Directory.CreateDirectory(subDirectory_Manual_Name); f = new WmvAdapter(_videoFile, Path.Combine(subDirectory_Manual_Name)); } } button1.Enabled = false; f.Secondpass = secondpass; f.FramesToSave = _fts; f.FrameCountAvailable += new WmvAdapter.FrameCountEventHandler(f_FrameCountAvailable); f.StatusChanged += new WmvAdapter.EventHandler(f_StatusChanged); f.ProgressChanged += new WmvAdapter.ProgressEventHandler(f_ProgressChanged); this.Text = "Processing Please Wait..."; label5.ForeColor = Color.Green; label5.Text = "Processing Please Wait"; button8.Enabled = false; button5.Enabled = false; label5.Visible = true; pictureBox1.Image = Lightnings_Extractor.Properties.Resources.Weather_Michmoret; Hrs = 0; //number of hours Min = 0; //number of Minutes Sec = 0; //number of Sec timeElapsed = 0; label10.Text = "00:00:00"; label11.Visible = false; label12.Visible = false; label9.Visible = false; label8.Visible = false; this.button1.Enabled = false; myTrackPanelss1.trackBar1.Enabled = false; this.checkBox2.Enabled = false; this.checkBox1.Enabled = false; numericUpDown1.Enabled = false; timer1.Start(); label2.Text = ""; label1.Visible = true; label2.Visible = true; label3.Visible = true; label4.Visible = true; f.Start(); } And this is the class (not my own class; I only changed it in a few places) that is causing the problem: using System; using System.Diagnostics; using System.Drawing; using System.Drawing.Imaging; using System.IO; using System.Runtime.InteropServices; using DirectShowLib; using
System.Collections.Generic; using Extracting_Frames; using System.Windows.Forms; namespace Polkan.DataSource { internal class WmvAdapter : ISampleGrabberCB, IDisposable { #region Fields_Properties_and_Events bool dis = false; int count = 0; const string fileName = @"d:\histogramValues.dat"; private IFilterGraph2 _filterGraph; private IMediaControl _mediaCtrl; private IMediaEvent _mediaEvent; private int _width; private int _height; private readonly string _outFolder; private int _frameId; //better use a custom EventHandler that passes the results of the action to the subscriber. public delegate void EventHandler(object sender, EventArgs e); public event EventHandler StatusChanged; public delegate void FrameCountEventHandler(object sender, FrameCountEventArgs e); public event FrameCountEventHandler FrameCountAvailable; public delegate void ProgressEventHandler(object sender, ProgressEventArgs e); public event ProgressEventHandler ProgressChanged; private IMediaSeeking _mSeek; private long _duration = 0; private long _avgFrameTime = 0; //just save the averages to a List (not to fs) public List<double> AveragesList { get; set; } public List<long> histogramValuesList; public bool Secondpass { get; set; } public List<int> FramesToSave { get; set; } #endregion #region Constructors and Destructors public WmvAdapter(string file, string outFolder) { _outFolder = outFolder; try { SetupGraph(file); } catch { Dispose(); MessageBox.Show("A codec is required to load this video file. Please use http://www.headbands.com/gspot/ or search the web for the correct codec"); } } ~WmvAdapter() { CloseInterfaces(); } #endregion public void Dispose() { CloseInterfaces(); } public void Start() { EstimateFrameCount(); int hr = _mediaCtrl.Run(); WaitUntilDone(); DsError.ThrowExceptionForHR(hr); } public void WaitUntilDone() { int hr; const int eAbort = unchecked((int)0x80004004); do { System.Windows.Forms.Application.DoEvents(); EventCode evCode; if (dis == true) { return; } hr = _mediaEvent.WaitForCompletion(100, out evCode); }while (hr == eAbort); DsError.ThrowExceptionForHR(hr); OnStatusChanged(); } //Edit: added events protected virtual void OnStatusChanged() { if (StatusChanged != null) StatusChanged(this, new EventArgs()); } protected virtual void OnFrameCountAvailable(long frameCount) { if (FrameCountAvailable != null) FrameCountAvailable(this, new FrameCountEventArgs() { FrameCount = frameCount }); } protected virtual void OnProgressChanged(int frameID) { if (ProgressChanged != null) ProgressChanged(this, new ProgressEventArgs() { FrameID = frameID }); } /// <summary> build the capture graph for grabber. 
</summary> private void SetupGraph(string file) { ISampleGrabber sampGrabber = null; IBaseFilter capFilter = null; IBaseFilter nullrenderer = null; _filterGraph = (IFilterGraph2)new FilterGraph(); _mediaCtrl = (IMediaControl)_filterGraph; _mediaEvent = (IMediaEvent)_filterGraph; _mSeek = (IMediaSeeking)_filterGraph; var mediaFilt = (IMediaFilter)_filterGraph; try { // Add the video source int hr = _filterGraph.AddSourceFilter(file, "Ds.NET FileFilter", out capFilter); DsError.ThrowExceptionForHR(hr); // Get the SampleGrabber interface sampGrabber = new SampleGrabber() as ISampleGrabber; var baseGrabFlt = sampGrabber as IBaseFilter; ConfigureSampleGrabber(sampGrabber); // Add the frame grabber to the graph hr = _filterGraph.AddFilter(baseGrabFlt, "Ds.NET Grabber"); DsError.ThrowExceptionForHR(hr); // --------------------------------- // Connect the file filter to the sample grabber // Hopefully this will be the video pin, we could check by reading it's mediatype IPin iPinOut = DsFindPin.ByDirection(capFilter, PinDirection.Output, 0); // Get the input pin from the sample grabber IPin iPinIn = DsFindPin.ByDirection(baseGrabFlt, PinDirection.Input, 0); hr = _filterGraph.Connect(iPinOut, iPinIn); DsError.ThrowExceptionForHR(hr); // Add the null renderer to the graph nullrenderer = new NullRenderer() as IBaseFilter; hr = _filterGraph.AddFilter(nullrenderer, "Null renderer"); DsError.ThrowExceptionForHR(hr); // --------------------------------- // Connect the sample grabber to the null renderer iPinOut = DsFindPin.ByDirection(baseGrabFlt, PinDirection.Output, 0); iPinIn = DsFindPin.ByDirection(nullrenderer, PinDirection.Input, 0); hr = _filterGraph.Connect(iPinOut, iPinIn); DsError.ThrowExceptionForHR(hr); // Turn off the clock. This causes the frames to be sent // thru the graph as fast as possible hr = mediaFilt.SetSyncSource(null); DsError.ThrowExceptionForHR(hr); // Read and cache the image sizes SaveSizeInfo(sampGrabber); //Edit: get the duration hr = _mSeek.GetDuration(out _duration); DsError.ThrowExceptionForHR(hr); } finally { if (capFilter != null) { Marshal.ReleaseComObject(capFilter); } if (sampGrabber != null) { Marshal.ReleaseComObject(sampGrabber); } if (nullrenderer != null) { Marshal.ReleaseComObject(nullrenderer); } GC.Collect(); } } private void EstimateFrameCount() { try { //1sec / averageFrameTime double fr = 10000000.0 / _avgFrameTime; double frameCount = fr * (_duration / 10000000.0); OnFrameCountAvailable((long)frameCount); } catch { } } public double framesCounts() { double fr = 10000000.0 / _avgFrameTime; double frameCount = fr * (_duration / 10000000.0); return frameCount; } private void SaveSizeInfo(ISampleGrabber sampGrabber) { // Get the media type from the SampleGrabber var media = new AMMediaType(); int hr = sampGrabber.GetConnectedMediaType(media); DsError.ThrowExceptionForHR(hr); if ((media.formatType != FormatType.VideoInfo) || (media.formatPtr == IntPtr.Zero)) { throw new NotSupportedException("Unknown Grabber Media Format"); } // Grab the size info var videoInfoHeader = (VideoInfoHeader)Marshal.PtrToStructure(media.formatPtr, typeof(VideoInfoHeader)); _width = videoInfoHeader.BmiHeader.Width; _height = videoInfoHeader.BmiHeader.Height; //Edit: get framerate _avgFrameTime = videoInfoHeader.AvgTimePerFrame; DsUtils.FreeAMMediaType(media); GC.Collect(); } private void ConfigureSampleGrabber(ISampleGrabber sampGrabber) { var media = new AMMediaType { majorType = MediaType.Video, subType = MediaSubType.RGB24, formatType = FormatType.VideoInfo }; int hr = 
sampGrabber.SetMediaType(media); DsError.ThrowExceptionForHR(hr); DsUtils.FreeAMMediaType(media); GC.Collect(); hr = sampGrabber.SetCallback(this, 1); DsError.ThrowExceptionForHR(hr); } private void CloseInterfaces() { try { if (_mediaCtrl != null) { _mediaCtrl.Stop(); _mediaCtrl = null; dis = true; } } catch (Exception ex) { Debug.WriteLine(ex); } if (_filterGraph != null) { Marshal.ReleaseComObject(_filterGraph); _filterGraph = null; } GC.Collect(); } int ISampleGrabberCB.SampleCB(double sampleTime, IMediaSample pSample) { Marshal.ReleaseComObject(pSample); return 0; } int ISampleGrabberCB.BufferCB(double sampleTime, IntPtr pBuffer, int bufferLen) { if (Form1.ExtractAutomatic == true) { using (var bitmap = new Bitmap(_width, _height, _width * 3, PixelFormat.Format24bppRgb, pBuffer)) { if (!this.Secondpass) { long[] HistogramValues = Form1.GetHistogram(bitmap); long t = Form1.GetTopLumAmount(HistogramValues, 1000); Form1.averagesTest.Add(t); } else { //this is the changed part if (_frameId > 0) { if (Form1.averagesTest[_frameId] / 1000.0 - Form1.averagesTest[_frameId - 1] / 1000.0 > 150.0) { count = 6; } if (count > 0) { bitmap.RotateFlip(RotateFlipType.Rotate180FlipX); bitmap.Save(Path.Combine(_outFolder, _frameId.ToString("D6") + ".bmp")); count --; } } } _frameId++; //let only report each 100 frames for performance if (_frameId % 100 == 0) OnProgressChanged(_frameId); } } else { using (var bitmap = new Bitmap(_width, _height, _width * 3, PixelFormat.Format24bppRgb, pBuffer)) { if (!this.Secondpass) { //get avg double average = GetAveragePixelValue(bitmap); if (AveragesList == null) AveragesList = new List<double>(); //save avg AveragesList.Add(average); //***************************\\ // for (int i = 0; i < (int)framesCounts(); i++) // { // get histogram values long[] HistogramValues = Form1.GetHistogram(bitmap); if (histogramValuesList == null) histogramValuesList = new List<long>(256); histogramValuesList.AddRange(HistogramValues); //***************************\\ //} } else { if (FramesToSave != null && FramesToSave.Contains(_frameId)) { bitmap.RotateFlip(RotateFlipType.Rotate180FlipX); bitmap.Save(Path.Combine(_outFolder, _frameId.ToString("D6") + ".bmp")); // get histogram values long[] HistogramValues = Form1.GetHistogram(bitmap); if (histogramValuesList == null) histogramValuesList = new List<long>(256); histogramValuesList.AddRange(HistogramValues); using (BinaryWriter binWriter = new BinaryWriter(File.Open(fileName, FileMode.Create))) { for (int i = 0; i < histogramValuesList.Count; i++) { binWriter.Write(histogramValuesList[(int)i]); } binWriter.Close(); } } } _frameId++; //let only report each 100 frames for performance if (_frameId % 100 == 0) OnProgressChanged(_frameId); } } return 0; } /* int ISampleGrabberCB.SampleCB(double sampleTime, IMediaSample pSample) { Marshal.ReleaseComObject(pSample); return 0; } int ISampleGrabberCB.BufferCB(double sampleTime, IntPtr pBuffer, int bufferLen) { using (var bitmap = new Bitmap(_width, _height, _width * 3, PixelFormat.Format24bppRgb, pBuffer)) { if (!this.Secondpass) { //get avg double average = GetAveragePixelValue(bitmap); if (AveragesList == null) AveragesList = new List<double>(); //save avg AveragesList.Add(average); //***************************\\ // for (int i = 0; i < (int)framesCounts(); i++) // { // get histogram values long[] HistogramValues = Form1.GetHistogram(bitmap); if (histogramValuesList == null) histogramValuesList = new List<long>(256); histogramValuesList.AddRange(HistogramValues); long t = 
Form1.GetTopLumAmount(HistogramValues, 1000); //***************************\\ Form1.averagesTest.Add(t); // to add this list to a text file or binary file and read the averages from the file when its is Secondpass !!!!! //} } else { if (FramesToSave != null && FramesToSave.Contains(_frameId)) { bitmap.RotateFlip(RotateFlipType.Rotate180FlipX); bitmap.Save(Path.Combine(_outFolder, _frameId.ToString("D6") + ".bmp")); // get histogram values long[] HistogramValues = Form1.GetHistogram(bitmap); if (histogramValuesList == null) histogramValuesList = new List<long>(256); histogramValuesList.AddRange(HistogramValues); using (BinaryWriter binWriter = new BinaryWriter(File.Open(fileName, FileMode.Create))) { for (int i = 0; i < histogramValuesList.Count; i++) { binWriter.Write(histogramValuesList[(int)i]); } binWriter.Close(); } } for (int x = 1; x < Form1.averagesTest.Count; x++) { double fff = Form1.averagesTest[x] / 1000.0 - Form1.averagesTest[x - 1] / 1000.0; if (Form1.averagesTest[x] / 1000.0 - Form1.averagesTest[x - 1] / 1000.0 > 180.0) { bitmap.RotateFlip(RotateFlipType.Rotate180FlipX); bitmap.Save(Path.Combine(_outFolder, _frameId.ToString("D6") + ".bmp")); _frameId++; } } } _frameId++; //let only report each 100 frames for performance if (_frameId % 100 == 0) OnProgressChanged(_frameId); } return 0; }*/ private unsafe double GetAveragePixelValue(Bitmap bmp) { BitmapData bmData = null; try { bmData = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb); int stride = bmData.Stride; IntPtr scan0 = bmData.Scan0; int w = bmData.Width; int h = bmData.Height; double sum = 0; long pixels = bmp.Width * bmp.Height; byte* p = (byte*)scan0.ToPointer(); for (int y = 0; y < h; y++) { p = (byte*)scan0.ToPointer(); p += y * stride; for (int x = 0; x < w; x++) { double i = ((double)p[0] + p[1] + p[2]) / 3.0; sum += i; p += 3; } //no offset incrementation needed when getting //the pointer at the start of each row } bmp.UnlockBits(bmData); double result = sum / (double)pixels; return result; } catch { try { bmp.UnlockBits(bmData); } catch { } } return -1; } } public class FrameCountEventArgs { public long FrameCount { get; set; } } public class ProgressEventArgs { public int FrameID { get; set; } } } I remember I had codec problems before and installed the codecs that were needed, but in this case both QuickTime and Windows Media Player can play the video files, so why can't the application detect and find the codecs on my computer? GSpot says the codec is AVC1, but again, WMP and QuickTime play the video files with no problems. The video files are from my digital camera!

    Read the article

  • Is it possible to plot a single density over a discrete variable?

    - by mattrepl
    The x-axis is time broken up into time intervals. There is an interval column in the data frame that specifies the time for each row. Plotting a histogram or line using geom_histogram and geom_freqpoly works great, but I'd like to use geom_density to get a filled area. Perhaps there is a better way to achieve this. Right now, if I use geom_density, curves are created for each discrete factor level instead of smoothing over all of them.

    Read the article

  • MATLAB : frequency distribution

    - by Arkapravo
    I have raw observations of 500 numeric values (ranging from 1 to 25000) in a text file, and I wish to make a frequency distribution in MATLAB. I did try the histogram (hist); however, I would prefer a frequency distribution curve rather than blocks and bars. Any help is appreciated!
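
    The trick is that hist can return the bin counts instead of drawing bars, after which the counts can be drawn as a line; in MATLAB that is [n, c] = hist(x, 50); plot(c, n). The same two-step idea, sketched in Python/numpy (the file name is a placeholder):

        import numpy as np
        import matplotlib.pyplot as plt

        x = np.loadtxt('observations.txt')        # 500 values, 1..25000
        counts, edges = np.histogram(x, bins=50)  # bin first...
        centers = (edges[:-1] + edges[1:]) / 2
        plt.plot(centers, counts)                 # ...then draw the curve, not bars
        plt.show()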

    Read the article

  • Can ggplot2 work with R's canvas backend

    - by Casbon
    Having installed canvas from here http://www.rforge.net/canvas/files/ I try to plot: > canvas('test.js') > qplot(rnorm(100), geom='histogram') stat_bin: binwidth defaulted to range/30. Use 'binwidth = x' to adjust this. Error in grid.Call.graphics("L_setviewport", pvp, TRUE) : Non-finite location and/or size for viewport >

    Read the article

  • How to convert a one column integer data file into a mask for image

    - by gavishna
    I have a data file which contains about 1000 integers, say in the range 0-255, random in nature. I want to use it as a mask, i.e. to multiply it with one image that is in RGB and another image that is in grayscale format. How do I go about this? How do I convert/represent this data file as a matrix with the image's dimensions? Kindly suggest. Also, is it possible to obtain a 3D histogram?
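
    One simple reading of the problem: load the 1000 integers, repeat them out to the image's dimensions, scale to [0, 1], and multiply. A sketch in Python/numpy under those assumptions (the question is MATLAB-flavoured; file names are placeholders, and np.resize simply tiles the 1000 values until they fill the image, which is only one of several plausible layouts):

        import numpy as np
        import imageio

        rgb = imageio.imread('photo.jpg')           # H x W x 3 uint8 (placeholder)
        gray = rgb.mean(axis=2).astype(np.uint8)    # simple grayscale version

        vals = np.loadtxt('mask.txt')               # ~1000 integers in 0..255
        h, w = gray.shape
        mask = np.resize(vals, (h, w)) / 255.0      # tile to image size, scale to 0..1

        masked_gray = (gray * mask).astype(np.uint8)
        masked_rgb = (rgb * mask[:, :, None]).astype(np.uint8)  # broadcast over channels

    For the 3D histogram question, np.histogramdd(rgb.reshape(-1, 3), bins=8) bins the (R, G, B) triples jointly, which is the usual meaning of a 3D color histogram.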

    Read the article

  • Utility to record IO statistics (random/sequential, block sizes, read/write ratio) in Unix

    - by Michael Pearson
    As part of provisioning our new server (see other SF) I'd like to find out the following: the ratio of random to sequential reads & writes, and the amount of data read & written at a time (preferably in histogram form). I can already figure out our reads/writes at a per-operation and overall data level using iostat & dstat, but I'd like to know more. For example, I'd like to know that we're doing mostly random 16kb reads, or a lot of sequential 64kb reads with random writes. We're (currently) on an Ubuntu 10.04 VM. Is there a utility I can run that will record and present this information for me?
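
    blktrace is the closest fit: it records every block-layer request with its offset, size, and direction, and blkparse/btt or seekwatcher can summarize the trace into exactly the kind of random-vs-sequential and request-size breakdowns described. As a lightweight approximation without tracing, the cumulative counters in /proc/diskstats yield average request sizes over an interval; a rough Python sketch (the device name is a placeholder, and this gives averages, not a histogram):

        import time

        def disk_counters(dev):
            # /proc/diskstats columns after (major, minor, name):
            # 0 reads completed, 2 sectors read, 4 writes completed, 6 sectors written
            with open('/proc/diskstats') as f:
                for line in f:
                    p = line.split()
                    if p[2] == dev:
                        n = [int(x) for x in p[3:]]
                        return n[0], n[2], n[4], n[6]
            raise ValueError('device %s not found' % dev)

        r0, rs0, w0, ws0 = disk_counters('sda')
        time.sleep(10)
        r1, rs1, w1, ws1 = disk_counters('sda')

        # diskstats sector counts are in 512-byte units
        if r1 > r0:
            print('avg read size:  %.1f KiB' % ((rs1 - rs0) * 512.0 / (r1 - r0) / 1024))
        if w1 > w0:
            print('avg write size: %.1f KiB' % ((ws1 - ws0) * 512.0 / (w1 - w0) / 1024))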

    Read the article

  • Hello Operator, My Switch Is Bored

    - by Paul White
    This is a post for T-SQL Tuesday #43 hosted by my good friend Rob Farley. The topic this month is Plan Operators. I haven’t taken part in T-SQL Tuesday before, but I do like to write about execution plans, so this seemed like a good time to start. This post is in two parts. The first part is primarily an excuse to use a pretty bad play on words in the title of this blog post (if you’re too young to know what a telephone operator or a switchboard is, I hate you). The second part of the post looks at an invisible query plan operator (so to speak). 1. My Switch Is Bored Allow me to present the rare and interesting execution plan operator, Switch: Books Online has this to say about Switch: Following that description, I had a go at producing a Fast Forward Cursor plan that used the TOP operator, but had no luck. That may be due to my lack of skill with cursors, I’m not too sure. The only application of Switch in SQL Server 2012 that I am familiar with requires a local partitioned view: CREATE TABLE dbo.T1 (c1 int NOT NULL CHECK (c1 BETWEEN 00 AND 24)); CREATE TABLE dbo.T2 (c1 int NOT NULL CHECK (c1 BETWEEN 25 AND 49)); CREATE TABLE dbo.T3 (c1 int NOT NULL CHECK (c1 BETWEEN 50 AND 74)); CREATE TABLE dbo.T4 (c1 int NOT NULL CHECK (c1 BETWEEN 75 AND 99)); GO CREATE VIEW V1 AS SELECT c1 FROM dbo.T1 UNION ALL SELECT c1 FROM dbo.T2 UNION ALL SELECT c1 FROM dbo.T3 UNION ALL SELECT c1 FROM dbo.T4; Not only that, but it needs an updatable local partitioned view. We’ll need some primary keys to meet that requirement: ALTER TABLE dbo.T1 ADD CONSTRAINT PK_T1 PRIMARY KEY (c1);   ALTER TABLE dbo.T2 ADD CONSTRAINT PK_T2 PRIMARY KEY (c1);   ALTER TABLE dbo.T3 ADD CONSTRAINT PK_T3 PRIMARY KEY (c1);   ALTER TABLE dbo.T4 ADD CONSTRAINT PK_T4 PRIMARY KEY (c1); We also need an INSERT statement that references the view. Even more specifically, to see a Switch operator, we need to perform a single-row insert (multi-row inserts use a different plan shape): INSERT dbo.V1 (c1) VALUES (1); And now…the execution plan: The Constant Scan manufactures a single row with no columns. The Compute Scalar works out which partition of the view the new value should go in. The Assert checks that the computed partition number is not null (if it is, an error is returned). The Nested Loops Join executes exactly once, with the partition id as an outer reference (correlated parameter). The Switch operator checks the value of the parameter and executes the corresponding input only. If the partition id is 0, the uppermost Clustered Index Insert is executed, adding a row to table T1. If the partition id is 1, the next lower Clustered Index Insert is executed, adding a row to table T2…and so on. In case you were wondering, here’s a query and execution plan for a multi-row insert to the view: INSERT dbo.V1 (c1) VALUES (1), (2); Yuck! An Eager Table Spool and four Filters! I prefer the Switch plan. My guess is that almost all the old strategies that used a Switch operator have been replaced over time, using things like a regular Concatenation Union All combined with Start-Up Filters on its inputs. Other new (relative to the Switch operator) features like table partitioning have specific execution plan support that doesn’t need the Switch operator either. This feels like a bit of a shame, but perhaps it is just nostalgia on my part, it’s hard to know. Please do let me know if you encounter a query that can still use the Switch operator in 2012 – it must be very bored if this is the only possible modern usage! 2. 
Invisible Plan Operators The second part of this post uses an example based on a question Dave Ballantyne asked using the SQL Sentry Plan Explorer plan upload facility. If you haven’t tried that yet, make sure you’re on the latest version of the (free) Plan Explorer software, and then click the Post to SQLPerformance.com button. That will create a site question with the query plan attached (which can be anonymized if the plan contains sensitive information). Aaron Bertrand and I keep a close eye on questions there, so if you have ever wanted to ask a query plan question of either of us, that’s a good way to do it. The problem The issue I want to talk about revolves around a query issued against a calendar table. The script below creates a simplified version and adds 100 years of per-day information to it: USE tempdb; GO CREATE TABLE dbo.Calendar ( dt date NOT NULL, isWeekday bit NOT NULL, theYear smallint NOT NULL,   CONSTRAINT PK__dbo_Calendar_dt PRIMARY KEY CLUSTERED (dt) ); GO -- Monday is the first day of the week for me SET DATEFIRST 1;   -- Add 100 years of data INSERT dbo.Calendar WITH (TABLOCKX) (dt, isWeekday, theYear) SELECT CA.dt, isWeekday = CASE WHEN DATEPART(WEEKDAY, CA.dt) IN (6, 7) THEN 0 ELSE 1 END, theYear = YEAR(CA.dt) FROM Sandpit.dbo.Numbers AS N CROSS APPLY ( VALUES (DATEADD(DAY, N.n - 1, CONVERT(date, '01 Jan 2000', 113))) ) AS CA (dt) WHERE N.n BETWEEN 1 AND 36525; The following query counts the number of weekend days in 2013: SELECT Days = COUNT_BIG(*) FROM dbo.Calendar AS C WHERE theYear = 2013 AND isWeekday = 0; It returns the correct result (104) using the following execution plan: The query optimizer has managed to estimate the number of rows returned from the table exactly, based purely on the default statistics created separately on the two columns referenced in the query’s WHERE clause. (Well, almost exactly, the unrounded estimate is 104.289 rows.) There is already an invisible operator in this query plan – a Filter operator used to apply the WHERE clause predicates. We can see it by re-running the query with the enormously useful (but undocumented) trace flag 9130 enabled: Now we can see the full picture. The whole table is scanned, returning all 36,525 rows, before the Filter narrows that down to just the 104 we want. Without the trace flag, the Filter is incorporated in the Clustered Index Scan as a residual predicate. It is a little bit more efficient than using a separate operator, but residual predicates are still something you will want to avoid where possible. The estimates are still spot on though: Anyway, looking to improve the performance of this query, Dave added the following filtered index to the Calendar table: CREATE NONCLUSTERED INDEX Weekends ON dbo.Calendar(theYear) WHERE isWeekday = 0; The original query now produces a much more efficient plan: Unfortunately, the estimated number of rows produced by the seek is now wrong (365 instead of 104): What’s going on? The estimate was spot on before we added the index! Explanation You might want to grab a coffee for this bit. Using another trace flag or two (8606 and 8612) we can see that the cardinality estimates were exactly right initially: The highlighted information shows the initial cardinality estimates for the base table (36,525 rows), the result of applying the two relational selects in our WHERE clause (104 rows), and after performing the COUNT_BIG(*) group by aggregate (1 row). 
All of these are correct, but that was before cost-based optimization got involved :) Cost-based optimization When cost-based optimization starts up, the logical tree above is copied into a structure (the ‘memo’) that has one group per logical operation (roughly speaking). The logical read of the base table (LogOp_Get) ends up in group 7; the two predicates (LogOp_Select) end up in group 8 (with the details of the selections in subgroups 0-6). These two groups still have the correct cardinalities as trace flag 8608 output (initial memo contents) shows: During cost-based optimization, a rule called SelToIdxStrategy runs on group 8. Its job is to match logical selections to indexable expressions (SARGs). It successfully matches the selections (theYear = 2013, isWeekday = 0) to the filtered index, and writes a new alternative into the memo structure. The new alternative is entered into group 8 as option 1 (option 0 was the original LogOp_Select): The new alternative is to do nothing (PhyOp_NOP = no operation), but to instead follow the new logical instructions listed below the NOP. The LogOp_GetIdx (full read of an index) goes into group 21, and the LogOp_SelectIdx (selection on an index) is placed in group 22, operating on the result of group 21. The definition of the comparison ‘theYear = 2013’ (ScaOp_Comp downwards) was already present in the memo starting at group 2, so no new memo groups are created for that. New Cardinality Estimates The new memo groups require two new cardinality estimates to be derived. First, LogOp_GetIdx (full read of the index) gets a predicted cardinality of 10,436. This number comes from the filtered index statistics: DBCC SHOW_STATISTICS (Calendar, Weekends) WITH STAT_HEADER; The second new cardinality derivation is for the LogOp_SelectIdx applying the predicate (theYear = 2013). To get a number for this, the cardinality estimator uses statistics for the column ‘theYear’, producing an estimate of 365 rows (there are 365 days in 2013!): DBCC SHOW_STATISTICS (Calendar, theYear) WITH HISTOGRAM; This is where the mistake happens. Cardinality estimation should have used the filtered index statistics here, to get an estimate of 104 rows: DBCC SHOW_STATISTICS (Calendar, Weekends) WITH HISTOGRAM; Unfortunately, the logic has lost sight of the link between the read of the filtered index (LogOp_GetIdx) in group 21, and the selection on that index (LogOp_SelectIdx) that it is deriving a cardinality estimate for, in group 22. The correct cardinality estimate (104 rows) is still present in the memo, attached to group 8, but that group now has a PhyOp_NOP implementation. Skipping over the rest of cost-based optimization (in a belated attempt at brevity) we can see the optimizer’s final output using trace flag 8607: This output shows the (incorrect, but understandable) 365-row estimate for the index range operation, and the correct 104 estimate still attached to its PhyOp_NOP. This tree still has to go through a few post-optimizer rewrites and ‘copy out’ from the memo structure into a tree suitable for the execution engine. One step in this process removes PhyOp_NOP, discarding its 104-row cardinality estimate as it does so. To finish this section on a more positive note, consider what happens if we add an OVER clause to the query aggregate.
This isn’t intended to be a ‘fix’ of any sort, I just want to show you that the 104 estimate can survive and be used if later cardinality estimation needs it: SELECT Days = COUNT_BIG(*) OVER () FROM dbo.Calendar AS C WHERE theYear = 2013 AND isWeekday = 0; The estimated execution plan is: Note the 365 estimate at the Index Seek, but the 104 lives again at the Segment! We can imagine the lost predicate ‘isWeekday = 0’ as sitting between the seek and the segment in an invisible Filter operator that drops the estimate from 365 to 104. Even though the NOP group is removed after optimization (so we don’t see it in the execution plan) bear in mind that all cost-based choices were made with the 104-row memo group present, so although things look a bit odd, it shouldn’t affect the optimizer’s plan selection. I should also mention that we can work around the estimation issue by including the index’s filtering columns in the index key: CREATE NONCLUSTERED INDEX Weekends ON dbo.Calendar(theYear, isWeekday) WHERE isWeekday = 0 WITH (DROP_EXISTING = ON); There are some downsides to doing this, including that changes to the isWeekday column may now require Halloween Protection, but that is unlikely to be a big problem for a static calendar table ;)  With the updated index in place, the original query produces an execution plan with the correct cardinality estimation showing at the Index Seek: That’s all for today, remember to let me know about any Switch plans you come across on a modern instance of SQL Server! Finally, here are some other posts of mine that cover other plan operators: Segment and Sequence Project Common Subexpression Spools Why Plan Operators Run Backwards Row Goals and the Top Operator Hash Match Flow Distinct Top N Sort Index Spools and Page Splits Singleton and Range Seeks Bitmaps Hash Join Performance Compute Scalar © 2013 Paul White – All Rights Reserved Twitter: @SQL_Kiwi

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 28 (sys.dm_db_stats_properties)

    - by Tamarick Hill
    The sys.dm_db_stats_properties Dynamic Management Function returns information about the statistics that are currently on your database objects. This function takes two parameters, an object_id and a stats_id. Let’s have a look at the result set from this function against the AdventureWorks2012.Sales.SalesOrderHeader table. To obtain the object_id and stats_id I will use a CROSS APPLY with the sys.stats system table. SELECT sp.* FROM sys.stats s CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.Stats_id) sp WHERE sp.object_id = object_id('Sales.SalesOrderHeader') The first two columns returned by this function are the object_id and the stats_id columns. The next column, ‘last_updated’, gives you the date and the time that a particular statistic was last updated. The next column, ‘rows’, gives you the total number of rows in the table as of the last statistics update date. The ‘rows_sampled’ column gives you the number of rows that were sampled to create the statistic. The ‘steps’ column represents the number of specific value ranges from the statistic histogram. The ‘unfiltered_rows’ column represents the number of rows before any filters are applied. If a particular statistic is not filtered, the ‘unfiltered_rows’ column will always equal the ‘rows’ column. Lastly we have the ‘modification_counter’ column, which represents the number of modifications to the leading column in a given statistic since the last time the statistic was updated. Probably the most important column from this Dynamic Management Function is the ‘last_updated’ column. You want to always ensure that you have accurate and updated statistics on your database objects. Accurate statistics are vital for the query optimizer to generate efficient and reliable query execution plans. Without accurate and updated statistics, the performance of your SQL Server would likely suffer. For more information about this Dynamic Management Function, please see the below Books Online link: http://msdn.microsoft.com/en-us/library/jj553546.aspx Follow me on Twitter @PrimeTimeDBA

    Read the article

  • Ask the Readers: Do You Use the Command Line?

    - by Jason Fitzpatrick
    Despite over two decades of GUI interfaces, many power users still turn to the command prompt. This week we want to hear about when and how you use the command prompt on your computer. Long ago, in a time before you could manipulate your computer with a mouse and a series of buttons and windows, the command line ruled all. Even after years of GUI development and refinement, many people still turn to the command line to get things done. This week we want to hear all about your command line tips and tricks. Do you use the default command line for your OS? Have you enhanced it? Replaced it? What keeps you coming back to the command line when everyone happily works away in the OS’s GUI? Sound off in the comments and don’t forget to check back in on Friday to see the What You Said roundup.

    Read the article

  • Happy Tau Day! (Or: How Some Mathematicians Think We Should Retire Pi) [Video]

    - by Jason Fitzpatrick
    When you were in school you learned all about Pi and its relationship to circles and turn-based geometry. Some mathematicians are rallying for a new lesson, one about Tau. Michael Hartl is a mathematician on a mission, a mission to get people away from using Pi and to start using Tau. His manifesto opens: Welcome to The Tau Manifesto. This manifesto is dedicated to one of the most important numbers in mathematics, perhaps the most important: the circle constant relating the circumference of a circle to its linear dimension. For millennia, the circle has been considered the most perfect of shapes, and the circle constant captures the geometry of the circle in a single number. Of course, the traditional choice of circle constant is π—but, as mathematician Bob Palais notes in his delightful article “π Is Wrong!”, π is wrong. It’s time to set things right. Why is Pi wrong? Among the arguments: Tau is the ratio of a circumference to the radius of a circle, and defining circles by their radius is more natural; and Pi is a 2-factor number, while with Tau everything is based on a single unit–three quarters of a turn around a Tau-defined circle is simply three quarters of a Tau radian. Watch the video above to see the Tau sequence (which begins 6.2831853071…) turned into a musical composition. For more information about Tau hit up the link below to read the manifesto. The Tau Manifesto [TauDay]

    Read the article

  • Shooting Print Quality Pictures with a Camera Phone [Video]

    - by Jason Fitzpatrick
    Camera phones get a lot of grief for being underpowered, but in this video tutorial the crew at SLR Lounge shows how basic technique and a good eye overcome all. Last year over at the photo blog FStoppers, they put together a video showing off how you could use the iPhone as a fashion camera–essentially arguing that the camera wasn’t as important as the photographer. A lot of people said “Well yeah, but you had professional models and thousands of dollars in lighting equipment!” in reaction to the video. In turn, the crew at SLR Lounge decided to make their own video, shot using only an iPhone camera and two reflectors (as well as an attractive but informal model). It of course helps to have some sidekicks to help hold up your reflectors, but the point still stands: modern camera phones are perfectly capable of good photos. The SLR Lounge iPhone Photo Shoot – A Follow Up Tribute to The FStoppers [YouTube]

    Read the article

  • problem with hand tracking, opencv

    - by JP Talusan
    I am currently creating an OpenCV program that identifies a hand in an image and then gets the contour of the hand only, in order to get the center (x, y) of the hand in pixels. The problem is that whenever the image or video includes an arm or a face, we can't split or separate the hand from the contours of the arm or the face. We are currently using an HSV flesh-colored histogram to get the contours of the hand. Is there a way to separate them? I just need the hand. Also, if the picture includes only a hand and some part of the arm, how can we isolate the palm itself from the rest of the picture? All we need is a clear center of the palm. Thanks in advance.
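
    A trick that helps with the palm specifically is the distance transform: the pixel deepest inside the skin mask sits near the middle of the palm no matter how much arm is attached, because fingers and forearm are thinner than the palm. A minimal sketch in Python/OpenCV (the HSV skin range is a rough assumption to tune for your lighting; the largest-contour centroid is shown for comparison, but it will drift toward the arm):

        import cv2
        import numpy as np

        img = cv2.imread('hand.jpg')
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        # rough skin range; tune these bounds for your lighting and skin tones
        mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

        # centroid of the largest skin blob (may still include the arm)
        # note: OpenCV 4 API; OpenCV 3 returns an extra first value here
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        blob = max(contours, key=cv2.contourArea)
        m = cv2.moments(blob)
        cx, cy = int(m['m10'] / m['m00']), int(m['m01'] / m['m00'])

        # deepest point inside the mask ~= center of the palm
        dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
        _, _, _, palm_center = cv2.minMaxLoc(dist)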

    Read the article

  • image focus calculation

    - by Oren Mazor
    Hi folks, I'm trying to develop an image focusing algorithm for some test automation work. I've chosen to use AForge.net, since it seems like a nice, mature, .NET-friendly system. Unfortunately, I can't seem to find information on building autofocus algorithms from scratch, so I've given it my best try: take an image; apply a Sobel edge detection filter, which generates a greyscale edge outline; generate a histogram and save the standard deviation; move the camera one step closer to the subject and take another picture; if the standard deviation is smaller than the previous one, we're getting more in focus; otherwise, we've passed the optimal distance for taking pictures. Is there a better way? Update: HUGE flaw in this, by the way. As I get past the optimal focus point, my "image in focus" value continues growing. You'd expect a parabolic-ish function looking at distance vs. focus value, but in reality you get something that's more logarithmic.
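
    For what it's worth, the usual contrast-based focus measure behaves the way the asker expected: sharp images carry more high-frequency energy, so a score such as the variance of the Laplacian peaks at best focus and falls off on both sides (monotonic growth past the focus point often points to auto-exposure or gain changing between shots). A minimal sketch in Python/OpenCV (the question targets AForge.net, so this shows the metric, not that library; file names are placeholders):

        import cv2

        def focus_score(path):
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            # sharp edges -> strong second derivatives -> high variance
            return cv2.Laplacian(gray, cv2.CV_64F).var()

        # sweep the camera, score each frame, keep the frame nearest the peak
        scores = {f: focus_score(f) for f in ('step1.png', 'step2.png', 'step3.png')}
        best = max(scores, key=scores.get)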

    Read the article

  • Image Application in WPF and Performance

    - by Harsha
    Hello all, I am planning to build an image processing application using WPF. Brightness/contrast and histogram are the main operations of this application. I have downloaded the application "Foundations: Bitmaps and Pixel Bits" from http://msdn.microsoft.com/en-us/magazine/cc534995.aspx. But when I try to open images that are larger than 1200x1600, it is very slow. How do I increase the performance? Has anyone worked on image processing in WPF? Please suggest how to solve this performance issue in WPF for operations on images larger than 1600x1200. Thank you, Harsha

    Read the article

  • Copying a 14bit grayscale image (saved in long[]) to a pictureBox

    - by Itsik
    My camera gives me 14-bit grayscale images, but the API's function returns a long* to the image data (so I'm assuming 4 bytes per pixel). My application is written in C++/CLI, and the pictureBox is of .NET type. I am currently using the BitmapData.LockBits() mechanism to gain pointer access to the image data, and using memcpy(bmpData.Scan0.ToPointer(), imageData, sizeof(long)*height*width) to copy the image data to the Bitmap. For now, the only PixelFormat that works is 32-bit RGB, and the image appears in shades of blue with contours. Trying to initialize the Bitmap as 16bppGrayscale isn't working. I would ideally want to cast the array from long to word and use a 16-bit format (hoping the 14-bit data will be displayed properly), but I'm not sure if this works. Also, I don't want to iterate over the image data, so finding the min/max and then histogram stretching to [0..255] isn't an option for me (the display must be as efficient as possible). Thanks
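
    Separate from the copy mechanics, 14-bit samples sitting in 32-bit slots need a per-pixel shift before any standard display format will render them sensibly: drop the low 6 bits for an 8-bit display, or shift up by 2 for a 16-bit one. The shift needs no explicit loop wherever a vectorized copy exists; a sketch of the arithmetic in Python/numpy (the question is C++/CLI, so this only illustrates the scaling; the file name and dimensions are placeholders):

        import numpy as np

        w, h = 640, 480
        # 14-bit values stored in 32-bit slots, as the camera API delivers them
        raw = np.fromfile('frame.raw', dtype=np.uint32).reshape(h, w)

        disp8 = (raw >> 6).astype(np.uint8)    # keep the top 8 of 14 bits for display
        disp16 = (raw << 2).astype(np.uint16)  # or scale 14 -> 16 bits for a 16bpp format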

    Read the article

  • What is the simplest method to fill the area under a geom_freqpoly line?

    - by mattrepl
    The x-axis is time broken up into time intervals. There is an interval column in the data frame that specifies the time for each row. The column is a factor, where each interval is a different factor level. Plotting a histogram or line using geom_histogram and geom_freqpoly works great, but I'd like to have a line, like that provided by geom_freqpoly, with the area underneath filled. Currently I'm using geom_freqpoly like this: ggplot(quake.data, aes(interval, fill=tweet.type)) + geom_freqpoly(aes(group = tweet.type, colour = tweet.type)) + opts(axis.text.x=theme_text(angle=-60, hjust=0, size = 6)) I would prefer a filled area, such as that provided by geom_density, but without smoothing the line. UPDATE: geom_area has been suggested; is there any way to use a ggplot2-generated statistic, such as ..count.., for geom_area's y-values? Or does the count aggregation need to occur prior to using ggplot2?

    Read the article

  • Pig: Count number of keys in a map

    - by Donald Miner
    I'd like to count the number of keys in a map in Pig. I could write a UDF to do this, but I was hoping there would be an easier way. data = LOAD 'hbase://MARS1' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage( 'A:*', '-loadKey true -caching=100000') AS (id:bytearray, A_map:map[]); In the code above, I basically want to build a histogram of id and how many items in column family A each key has. Hoping it would work, I tried c = FOREACH data GENERATE id, COUNT(A_map); but that unsurprisingly didn't work. Perhaps someone can suggest a better way to do this entirely. If I can't figure this out soon, I'll just write a Java MapReduce job or a Pig UDF.

    Read the article

  • R: Cut and labels/breaks length conflict

    - by AkselO
    I am working with the cut function to prep data for a barplot histogram, but I keep running into a seeming inconsistency between my labels and breaks: Error in cut.default(sample(1:1e+05, 500, T), breaks = sq, labels = sprintf("$%.0f", : labels/breaks length conflict Here is an example. I pretend that it is income data, using a sequence of 0 to $100,000 in bins of $10,000. I use the same variable to generate both breaks and labels, with minor formatting on the label side. I thought they might for some reason have different lengths when compared as character vectors, but they appear to have the same length, still. > sq<-seq(0,100000,10000) > cut(sample(1:100000, 500, T),breaks=sq,labels=sprintf("$%.0f",sq)) > length(sprintf("$%.0f",sq)) [1] 11 > length(sq) [1] 11
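
    This is the off-by-one built into cut: n breakpoints define n - 1 intervals, so labels must be one element shorter than breaks; labels=sprintf("$%.0f", sq[-length(sq)]) satisfies it. The same relationship holds elsewhere; a sketch of the equivalent binning in Python/pandas, whose cut has the same breaks-vs-labels rule:

        import numpy as np
        import pandas as pd

        sq = np.arange(0, 100001, 10000)         # 11 breakpoints -> 10 bins
        labels = ['$%d' % b for b in sq[:-1]]    # one label per bin, not per breakpoint
        income = np.random.randint(1, 100001, size=500)
        binned = pd.cut(income, bins=sq, labels=labels)
        print(pd.Series(binned).value_counts())  # counts per income bin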

    Read the article

  • Large number of tables and Hibernate memory consumption

    - by Vedran
    I'm working on a large ERP project which has a database model with about 2100 tables. With "only" 500 tables mapped with Hibernate, the application deployed on the web server takes about 3GB of working memory. Is there any way to reduce Hibernate's metamodel memory footprint when using that many tables in one persistence unit? Or should I just give up on ORMs and go with plain old JDBC (or even jOOQ)? Right now I'm using Hibernate 4.1.8, Spring 3.1.3, JBoss AS 7.1 and working with an MSSQL database. Edit: JavaMelody memory histogram output - with 2000 generated test tables that are a bit smaller in scope than the original db model (hence 'only' 1.3GB of spent memory)

    Read the article

  • how to visualize (value, count) dataset with thousands data points

    - by user510040
    I have a file with 2 numeric columns: value and count. The file may have 5000 rows. I do plot(value, count) to find the shape of the distribution, but because there are too many data points the picture is not very clear. Do you know a better visualization approach? Probably a histogram or barplot, grouping close values on the x axis, would be a better way to look at the data? I cannot figure out the syntax for using a histogram or barplot in my case.
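
    Since each row is already a (value, count) pair, the counts should enter a histogram as weights rather than being tallied again. A sketch in Python/numpy and matplotlib (the plot(value, count) call suggests R, where hist(rep(value, count)) expands the pairs to the same effect; the file name here is a placeholder):

        import numpy as np
        import matplotlib.pyplot as plt

        value, count = np.loadtxt('data.txt', unpack=True)
        # group nearby values into bins, weighting each value by its count
        hist, edges = np.histogram(value, bins=50, weights=count)
        centers = (edges[:-1] + edges[1:]) / 2
        plt.bar(centers, hist, width=edges[1] - edges[0])
        plt.show()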

    Read the article

  • What's the best way to install the GD graphics library for Nagios?

    - by user1196
    While trying to install Nagios 3.2.3, I ran their ./configure script and got these errors: checking for main in -liconv... no checking for gdImagePng in -lgd (order 1)... no checking for gdImagePng in -lgd (order 2)... no checking for gdImagePng in -lgd (order 3)... no checking for gdImagePng in -lgd (order 4)... no *** GD, PNG, and/or JPEG libraries could not be located... ********* Boutell's GD library is required to compile the statusmap, trends and histogram CGIs. Get it from http://www.boutell.com/gd/, compile it, and use the --with-gd-lib and --with-gd-inc arguments to specify the locations of the GD library and include files. NOTE: In addition to the gd-devel library, you'll also need to make sure you have the png-devel and jpeg-devel libraries installed on your system. NOTE: After you install the necessary libraries on your system: 1. Make sure /etc/ld.so.conf has an entry for the directory in which the GD, PNG, and JPEG libraries are installed. 2. Run 'ldconfig' to update the run-time linker options. 3. Run 'make clean' in the Nagios distribution to clean out any old references to your previous compile. 4. Rerun the configure script. NOTE: If you can't get the configure script to recognize the GD libs on your system, get over it and move on to other things. The CGIs that use the GD libs are just a small part of the entire Nagios package. Get everything else working first and then revisit the problem. Make sure to check the nagios-users mailing list archives for possible solutions to GD library problems when you resume your troubleshooting. ******************************************************************** Which package do I want? libgd2-xpm-dev? libgd2-noxpm-dev? php5-gd? I'm not looking to do any image processing myself - I just want to get Nagios working.

    Read the article
