Search Results

Search found 11362 results on 455 pages for 'big o analysis'.

Page 31/455 | < Previous Page | 27 28 29 30 31 32 33 34 35 36 37 38  | Next Page >

  • How to create a conditional measure in SQL Server 2008 Analysis services

    - by Jonathan
    Hi there, I am not sure if the title has the correct terms. I am a developer and am very new to cubes. I have a cube which has data associated with materials that are broken down into chemical compounds. For example, a rock material is 10% of this chemical, 10% of that chemical, etc. Samples are taken daily, and sample is a dimension with date, etc. So, the measure needs to average over the sample dimension but needs to sum across the chemical compound dimension (to add up to 100%, for example). Is this at all possible?

    Read the article

  • Dealing with big IF statements in PHP

    - by Industrial
    Hi everyone, Is there any good alternative to plain if statements in PHP? I know about switch, but I guess there's some more refined alternative out there that comes in handy when working with really big if statements. Thanks a lot,
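
    The question asks about PHP, but one common alternative to a long if/elseif chain is a lookup table: an associative array (a Map in Java) from case values to handlers. The sketch below is not from the thread and uses hypothetical handler names; the same idea works in PHP with an array of closures.

    ```java
    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Function;

    public class DispatchTableDemo {
        public static void main(String[] args) {
            // Map each command name to a handler instead of a long if/elseif chain.
            Map<String, Function<String, String>> handlers = new HashMap<>();
            handlers.put("upper", s -> s.toUpperCase());
            handlers.put("lower", s -> s.toLowerCase());
            handlers.put("reverse", s -> new StringBuilder(s).reverse().toString());

            String command = "reverse"; // hypothetical input; would normally come from the request
            Function<String, String> handler =
                    handlers.getOrDefault(command, s -> "unknown command");
            System.out.println(handler.apply("Hello"));
        }
    }
    ```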

    Read the article

  • MySQL Non Index Queries Analysis

    - by Markii
    I'm using the "log queries not using indexes" option, but it also logs queries that do use indexes and are just more complex or use IFs. Is there a parser or a program out there that can analyze the log and give me a literal output saying "table.column should be an index"? Thanks

    Read the article

  • Ignore designer and generated files in Resharper analysis

    - by RaYell
    I've been using ReSharper for a few days and I really like this tool, but there's one thing that annoys me about it and I wonder if it can be changed. I'm getting lots of issue notifications from generated code (almost 1400 in my project). I'd like to set those files as ignored so they won't be checked, as you can do with StyleCop and Code Analysis. Unfortunately it looks like ReSharper ignores the Generated Code settings from its options, because I'm still getting the same notifications. I've tried setting a file mask (e.g. for *.resx) and adding files manually to the generated code list, but it still doesn't change anything. I don't know if it matters, but I'm using VS 2010.

    Read the article

  • Image analysis on the iPhone

    - by user362808
    Hi, guys! Hopefully a quick one. I'm working on a devious little algorithm that'll automatically scan a user's iPhone for a photo of some emotional poignance (e.g. go back in time a ways, look for one of a group of photos that are proximally timestamped, etc.). I need a quick-and-dirty way of picking out a photo containing two people in close proximity to each other. I know that the OpenCV Python lib [for image processing] works on the iPhone; I'm just curious whether it's actually the best way of doing this (from scratch or otherwise). Cheers!

    Read the article

  • Sum and average over null values in SQL Server 2008 Analysis Services

    - by Jonathan
    I have a simple problem, I think, but I have googled and can't find the solution. I have a cube that has MeasureA, MeasureB and MeasureC. Not all three measures have values for each record; sometimes they can be null, depending on whether they were applicable. Now for my totals I need to average, but the average must not take nulls into account. Any help will be much appreciated. When I view the measures, the null values show as zeros.

    Read the article

  • I want to read a big text file

    - by Rozer
    I want to read a big text file. What I decided was to create four threads and have each one read 25% of the file, then join them, but it's not much faster. Can anyone tell me whether I can use concurrent programming for this? My file structure has data such as name, contact, company, policyname, policynumber, uniqueno, and in the end I want to put all the data in a HashMap. Thanks
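
    Not part of the original question: a minimal, single-threaded Java sketch of loading such a file into a HashMap. The file name, the whitespace-separated layout and the choice of uniqueno as the key are assumptions taken loosely from the question. Since a single disk is usually the bottleneck, four threads reading the same file rarely make this faster.

    ```java
    import java.io.BufferedReader;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.HashMap;
    import java.util.Map;

    public class BigFileReader {
        public static void main(String[] args) throws IOException {
            // Assumed layout per line: name contact company policyname policynumber uniqueno
            Map<String, String[]> records = new HashMap<>();
            try (BufferedReader reader = Files.newBufferedReader(Paths.get("policies.txt"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] fields = line.trim().split("\\s+");
                    if (fields.length == 0 || fields[0].isEmpty()) continue; // skip blank lines
                    // Key the record by its last field (uniqueno in the assumed layout).
                    records.put(fields[fields.length - 1], fields);
                }
            }
            System.out.println("Loaded " + records.size() + " records");
        }
    }
    ```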

    Read the article

  • How big in bytes is a DataRow

    - by JeffreyABecker
    I have a one-time process (hashing all our user passwords) written using DataSets. The performance needs improvement, so we've profiled the application and found that increasing the 'batch size' of the update will improve performance. I also know that if I load the entire data set into memory, the application will start hitting swap and slow down. The question is: how big is a System.Data.DataRow-derived class? I'd like to calculate a batch size that I know won't force the application into swap.

    Read the article

  • Performance analysis for a Java application

    - by user1827614
    I want to measure the performance of my application and would like to be able to configure the stats for specific modules (enable them for some modules and disable them for others), and I want to measure things like memory usage, threads, average bandwidth, etc. Can anyone suggest something, please? I am new to this. I think VisualVM is good, but it does not support configuring different modules. Would Perf4j or Admin4j work here? Has anyone used these before?
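
    Not from the original post: before adopting Perf4j or Admin4j, a rough per-module measurement can be done with System.nanoTime and the Runtime memory counters, toggled by a flag per module. A minimal sketch with hypothetical module names:

    ```java
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Supplier;

    public class SimpleProfiler {
        // Per-module on/off switches (module names are hypothetical).
        private static final Map<String, Boolean> enabled = new ConcurrentHashMap<>();
        static {
            enabled.put("parser", true);
            enabled.put("renderer", false);
        }

        public static <T> T time(String module, Supplier<T> work) {
            if (!enabled.getOrDefault(module, false)) {
                return work.get(); // measurement disabled for this module
            }
            long before = System.nanoTime();
            T result = work.get();
            long elapsedMs = (System.nanoTime() - before) / 1_000_000;
            long usedMb = (Runtime.getRuntime().totalMemory()
                    - Runtime.getRuntime().freeMemory()) / (1024 * 1024);
            System.out.printf("[%s] %d ms, heap in use ~%d MB%n", module, elapsedMs, usedMb);
            return result;
        }

        public static void main(String[] args) {
            String s = time("parser", () -> "abc".repeat(100_000));
            System.out.println(s.length());
        }
    }
    ```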

    Read the article

  • C# string analysis

    - by Sergei
    I have a string, for example like " :)text :)text:) :-) word :-( ". I need to append it to a textbox (or somewhere else), with the condition that instead of ':)', ':-(', etc. I need to call a function which inserts a specific symbol. I think a solution exists using a finite-state machine, but I don't know how to implement it. Waiting for advice.
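
    Not from the original post, and the question is about C#, but the usual alternative to a hand-written state machine is a token-to-symbol map plus a scan that prefers the longest match. A minimal Java sketch of that idea (the replacement symbols are placeholders):

    ```java
    import java.util.LinkedHashMap;
    import java.util.Map;

    public class EmoticonReplacer {
        public static void main(String[] args) {
            // Longer tokens first so ":-)" is matched before ":)".
            Map<String, String> symbols = new LinkedHashMap<>();
            symbols.put(":-(", "\u2639"); // frowning face
            symbols.put(":-)", "\u263A"); // smiling face
            symbols.put(":)", "\u263A");

            String input = " :)text :)text:) :-) word :-( ";
            StringBuilder out = new StringBuilder();
            int i = 0;
            while (i < input.length()) {
                String match = null;
                for (String token : symbols.keySet()) {
                    if (input.startsWith(token, i)) { match = token; break; }
                }
                if (match != null) {
                    out.append(symbols.get(match)); // call site for the "specific symbol" function
                    i += match.length();
                } else {
                    out.append(input.charAt(i));
                    i++;
                }
            }
            System.out.println(out);
        }
    }
    ```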

    Read the article

  • How to divide a big text into columns

    - by kalininew
    Problem: There is a big piece of the text: <div class="cont"> <p> Sed ut perspiciatis, unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam eaque ipsa, quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt, explicabo. Nemo enim ipsam voluptatem, quia voluptas sit, aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos, qui ratione voluptatem sequi nesciunt, neque porro quisquam est, qui dolorem ipsum, quia dolor sit, amet, </p> <p> consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt, ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit, qui in ea voluptate velit esse, quam nihil molestiae consequatur, vel illum, qui dolorem eum fugiat, quo voluptas nulla pariatur? At vero eos et </p> <p> accusamus et iusto odio dignissimos ducimus, qui blanditiis praesentium voluptatum deleniti atque corrupti, quos dolores et quas molestias excepturi sint, obcaecati cupiditate non provident, similique sunt in culpa, qui officia deserunt mollitia animi, id est laborum et dolorum fuga. Et harum quidem rerum facilis est et expedita distinctio. Nam libero tempore, cum soluta nobis est eligendi optio, cumque nihil impedit, quo minus id, quod maxime placeat, </p> <p> facere possimus, omnis voluptas assumenda est, omnis dolor repellendus. Temporibus autem quibusdam et aut officiis debitis aut rerum necessitatibus saepe eveniet, ut et voluptates repudiandae sint et molestiae non recusandae. Itaque earum rerum hic tenetur a sapiente delectus, ut aut reiciendis voluptatibus maiores alias consequatur aut perferendis doloribus asperiores repellat. </p> <p> Sed ut perspiciatis, unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam eaque ipsa, quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt, explicabo. Nemo enim ipsam voluptatem, quia voluptas sit, aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos, qui ratione voluptatem sequi nesciunt, neque porro quisquam est, qui dolorem ipsum, quia dolor sit, amet, </p> <p> consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt, ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit, qui in ea voluptate velit esse, quam nihil molestiae consequatur, vel illum, qui dolorem eum fugiat, quo voluptas nulla pariatur? At vero eos et </p> <p> accusamus et iusto odio dignissimos ducimus, qui blanditiis praesentium voluptatum deleniti atque corrupti, quos dolores et quas molestias excepturi sint, obcaecati cupiditate non provident, similique sunt in culpa, qui officia deserunt mollitia animi, id est laborum et dolorum fuga. Et harum quidem rerum facilis est et expedita distinctio. Nam libero tempore, cum soluta nobis est eligendi optio, cumque nihil impedit, quo minus id, quod maxime placeat, </p> <p> facere possimus, omnis voluptas assumenda est, omnis dolor repellendus. Temporibus autem quibusdam et aut officiis debitis aut rerum necessitatibus saepe eveniet, ut et voluptates repudiandae sint et molestiae non recusandae. 
Itaque earum rerum hic tenetur a sapiente delectus, ut aut reiciendis voluptatibus maiores alias consequatur aut perferendis doloribus asperiores repellat. </p> </div> What is needed: divide it into two columns. On the page it should be split into two columns of roughly equal height. If possible, when the size of the text container changes, the columns should stay equal in height. Ideally it should be possible to specify a number "n" for how many columns to split the big piece of text into, i.e. to divide the text into any number of columns. Available tools: php, xslt, css, pure javascript (without jQuery). Which tool is best to use, and how can this be done so that the solution is cross-browser compatible?

    Read the article

  • PL/SQL pre-compile and Code Quality checks in an automated build environment?

    - by Lars Corneliussen
    We build software using Hudson and Maven. We have C#, Java and, last but not least, PL/SQL sources (sprocs, packages, DDL, CRUD). For C# and Java we do unit tests and code analysis, but we don't really know the health of our PL/SQL sources before we actually publish them to the target database. Requirements There are a couple of things we want to test, in the following priority: Are the sources valid, hence "compilable"? For packages, with respect to a certain database, would they compile? Code quality: do we have code flaws like duplicates, too-complex methods or other violations of a defined set of rules? Also, the tool must run headless (command line, ant, ...), and we want to do analysis on a partial code base (changed sources only). Tools We did a little research and found the following tools that could potentially help: Cast Application Intelligence Platform (AIP): seems to be a server that grasps information about "anything"; couldn't find a console version that would export in a readable format. Toad for Oracle: the Professional version is said to include something called Xpert that validates a set of rules against a code base. Sonar + PL/SQL plugin: uses Toad for Oracle to display code health the Sonar way; this is for browsing the current state of the code base. Semantic Designs DMSToolkit: quite general analysis of the source code base; command line available? Semantic Designs Clones Detector: detects clones, but also via command line? Fortify Source Code Analyzer: seems to be focused on security issues, but maybe it is extensible? more... So far, Toad for Oracle together with Sonar seems to be an elegant solution, but maybe we are missing something here? Any ideas? Other products? Experiences? Related questions on SO: http://stackoverflow.com/questions/531430/any-static-code-analysis-tools-for-stored-procedures http://stackoverflow.com/questions/839707/any-code-quality-tool-for-pl-sql http://stackoverflow.com/questions/956104/is-there-a-static-analysis-tool-for-python-ruby-sql-cobol-perl-and-pl-sql

    Read the article

  • Using Image Source with big images in WPF

    - by xyzzer
    I am working on an application that allows users to manipulate multiple images by using ItemsControl. I started running some tests and found that the app has problems displaying some big images - ie. it did not work with the high resolution (21600x10800), 20MB images from http://earthobservatory.nasa.gov/Features/BlueMarble/BlueMarble_monthlies.php, though it displays the 6200x6200, 60MB Hubble telescope image from http://zebu.uoregon.edu/hudf/hudf.jpg just fine. The original solution just specified an Image control with a Source property pointing at a file on a disk (through a binding). With the Blue Marble file - the image would just not show up. Now this could be just a bug hidden somewhere deep in the funky MVVM + XAML implementation - the visual tree displayed by Snoop goes like: Window/Border/AdornerDecorator/ContentPresenter/Grid/Canvas/UserControl/Border/ContentPresenter/Grid/Grid/Grid/Grid/Border/Grid/ContentPresenter/UserControl/UserControl/Border/ContentPresenter/Grid/Grid/Grid/Grid/Viewbox/ContainerVisual/UserControl/Border/ContentPresenter/Grid/Grid/ItemsControl/Border/ItemsPresenter/Canvas/ContentPresenter/Grid/Grid/ContentPresenter/Image... Now debug this! WPF can be crazy like that... Anyway, it turned out that if I create a simple WPF application - the images load just fine. I tried finding out the root cause, but I don't want to spend weeks on it. I figured the right thing to do might be to use a converter to scale the images down - this is what I have done: ImagePath = @"F:\Astronomical\world.200402.3x21600x10800.jpg"; TargetWidth = 2800; TargetHeight = 1866; and <Image> <Image.Source> <MultiBinding Converter="{StaticResource imageResizingConverter}"> <MultiBinding.Bindings> <Binding Path="ImagePath"/> <Binding RelativeSource="{RelativeSource Self}" /> <Binding Path="TargetWidth"/> <Binding Path="TargetHeight"/> </MultiBinding.Bindings> </MultiBinding> </Image.Source> </Image> and public class ImageResizingConverter : MarkupExtension, IMultiValueConverter { public Image TargetImage { get; set; } public string SourcePath { get; set; } public int DecodeWidth { get; set; } public int DecodeHeight { get; set; } public object Convert(object[] values, Type targetType, object parameter, CultureInfo culture) { this.SourcePath = values[0].ToString(); this.TargetImage = (Image)values[1]; this.DecodeWidth = (int)values[2]; this.DecodeHeight = (int)values[3]; return DecodeImage(); } private BitmapImage DecodeImage() { BitmapImage bi = new BitmapImage(); bi.BeginInit(); bi.DecodePixelWidth = (int)DecodeWidth; bi.DecodePixelHeight = (int)DecodeHeight; bi.UriSource = new Uri(SourcePath); bi.EndInit(); return bi; } public object[] ConvertBack(object value, Type[] targetTypes, object parameter, CultureInfo culture) { throw new Exception("The method or operation is not implemented."); } public override object ProvideValue(IServiceProvider serviceProvider) { return this; } } Now this works fine, except for one "little" problem. When you just specify a file path in Image.Source - the application actually uses less memory and works faster than if you use BitmapImage.DecodePixelWidth. Plus with Image.Source if you have multiple Image controls that point to the same image - they only use as much memory as if only one image was loaded. With the BitmapImage.DecodePixelWidth solution - each additional Image control uses more memory and each of them uses more than when just specifying Image.Source. 
Perhaps WPF somehow caches these images in compressed form, while if you specify the decoded dimensions it feels like you get an uncompressed image in memory; it also takes 6 times as long (perhaps without it the scaling is done on the GPU?), and it feels like the original high-resolution image also gets loaded and takes up space. If I just scale the image down, save it to a temporary file and then use Image.Source to point at the file, it will probably work, but it will be pretty slow and it will require handling cleanup of the temporary file. If I could detect an image that does not get loaded properly, maybe I could scale it down only when I need to, but Image.ImageFailed never gets triggered. Maybe it has something to do with video memory and this app just using more of it with the deep visual tree, opacity masks, etc. Actual question: How can I load big images as quickly as the plain Image.Source option does, without using more memory for additional copies, and without additional memory for the scaled-down version if I only need the images at a certain resolution lower than the original? Also, I don't want to keep them in memory if no Image control is using them anymore.

    Read the article

  • ReadDirectoryChangesW not working with a big file

    - by sanky
    I was working with ReadDirectoryChangesW until I encountered an unwanted behavior. Below is a short reference to the way I am using my ReadDirectoryChanges call. The behavior is that when a big file (gigabytes in size) is copied into the watched directory, I get the notification immediately and start my processing based on it, but the file I/O (the copy operation) is still pending, and that results in an unexpected state. Can anyone please suggest a way to synchronize this? I want the copy operation (that produced the notification) to complete first, and then do my processing. while(ReadDirectoryChangesW( hDir, pBuffer, nBufSize, FALSE, FILE_NOTIFY_CHANGE_FILE_NAME, &BytesReturned, NULL, NULL )) { pBufferCurrent = pBuffer; while(pBufferCurrent) { switch(pBufferCurrent->Action) { case FILE_ACTION_ADDED: break; default: break; } if (pBufferCurrent->NextEntryOffset) pBufferCurrent = (FILE_NOTIFY_INFORMATION*)(((oef_u8_t*)pBufferCurrent) + pBufferCurrent->NextEntryOffset); else pBufferCurrent = NULL; } }
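
    Not from the original post: a common workaround, independent of ReadDirectoryChangesW itself, is to treat the notification as a hint and wait until the file stops growing (or can be opened without sharing) before processing it. A minimal Java sketch of the size-stability check; the path and poll interval are placeholders:

    ```java
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class CopyCompletionWaiter {
        /** Blocks until the file size has been unchanged for two consecutive polls. */
        static void waitUntilStable(Path file, long pollMillis) throws IOException, InterruptedException {
            long previous = -1;
            while (true) {
                long current = Files.size(file);
                if (current == previous && current > 0) {
                    return; // size unchanged since the last poll: the copy is very likely finished
                }
                previous = current;
                Thread.sleep(pollMillis);
            }
        }

        public static void main(String[] args) throws Exception {
            Path watched = Paths.get(args.length > 0 ? args[0] : "incoming/big.iso");
            waitUntilStable(watched, 500);
            System.out.println("File looks complete, starting processing: " + watched);
        }
    }
    ```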

    Read the article

  • Python class design - Splitting up big classes into multiple ones to group functionality

    - by Ivo Wetzel
    OK, I've got 2 really big classes (1k lines each) that I currently have split up into multiple ones. They then get recombined using multiple inheritance. Now I'm wondering if there is any cleaner/better, more pythonic way of doing this. Completely factoring them out would result in endless amounts of self.otherself.do_something calls, which I don't think is the way it should be done. To make things clear, here's what it currently looks like: from gui_events import GUIEvents # event handlers from gui_helpers import GUIHelpers # helper methods that don't directly modify the GUI # GUI.py class GUI(gtk.Window, GUIEvents, GUIHelpers): # general stuff here One problem that results from this is Pylint complaining, giving me trillions of "init not called" / "undefined attribute" / "attribute accessed before definition" warnings.

    Read the article

  • MIPS assembly: big and little endian confusion

    - by Barney
    I've run the following code snippet on the MIPS MARS simulator. That simulator is little endian. So the results are as follows: lui $t0,0x1DE # $t0 = 0x01DE0000 ori $t0,$t0,0xCADE # $t0 = 0x01DECADE lui $t1,0x1001 # $t1 = 0x10010000 sw $t0,200($t1) # $t1 + 200 bytes = 0x01DECADE lw $t2,200($t1) # $t2 = 0x01DECADE So on a little endian MIPS simulator, the value of $t2 at the end of the program is 0x01DECADE. If this simulator was big endian, what would the value be? Would it be 0xDECADE01 or would it still be 0x01DECADE?
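
    Not part of the original question: endianness only changes how the word is laid out in memory, not the value read back by a matching load, so on a big-endian machine $t2 would still end up as 0x01DECADE while the bytes at $t1+200 would be stored in the opposite order. A small Java sketch of the same store/load round trip using ByteBuffer:

    ```java
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;
    import java.util.Arrays;

    public class EndianDemo {
        public static void main(String[] args) {
            int value = 0x01DECADE;
            for (ByteOrder order : new ByteOrder[] {ByteOrder.BIG_ENDIAN, ByteOrder.LITTLE_ENDIAN}) {
                ByteBuffer memory = ByteBuffer.allocate(4).order(order);
                memory.putInt(value);          // "sw": the byte layout in memory depends on the order
                int loaded = memory.getInt(0); // "lw" with the same order: the value is unchanged
                System.out.printf("%-13s memory=%s loaded=0x%08X%n",
                        order, Arrays.toString(memory.array()), loaded);
            }
        }
    }
    ```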

    Read the article

  • How to geocode a big number of addresses?

    - by user308569
    I need to geocode, i.e. translate street addresses to latitude/longitude, for ~8,000 street addresses. I am using both the Yahoo and Google geocoding engines at http://www.gpsvisualizer.com/geocoder/, and found that for a big number of addresses those engines (one of them or both) either could not perform the geocoding (i.e. return latitude=0, longitude=0) or return wrong coordinates (including cases where Yahoo and Google give different results). What is the best way to handle this problem? Which engine is (usually) more accurate? I would appreciate any thoughts, suggestions or ideas from people who have previous experience with this kind of task.

    Read the article

  • Building big, immutable objects without constructors having long parameter lists

    - by Malax
    Hi StackOverflow! I have some big (more than 3 fields) objects which can and should be immutable. Every time I run into that case, I tend to create constructor abominations with long parameter lists. It doesn't feel right, it is hard to use, and readability suffers. It is even worse if the fields are some sort of collection type like lists. A simple addSibling(S s) would ease the object creation so much, but renders the object mutable. What do you guys use in such cases? I'm on Scala and Java, but I think the problem is language-agnostic as long as the language is object-oriented. Solutions I can think of: "constructor abominations with long parameter lists" or the Builder pattern. Thanks for your input!
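
    Not from the original post: a minimal Java sketch of the Builder option mentioned above. The addSibling call stays on the builder, which is mutable only until build() is called, so the resulting object itself remains immutable (field names are hypothetical):

    ```java
    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public final class Node {
        private final String name;
        private final int weight;
        private final List<Node> siblings; // defensively copied, unmodifiable

        private Node(Builder b) {
            this.name = b.name;
            this.weight = b.weight;
            this.siblings = Collections.unmodifiableList(new ArrayList<>(b.siblings));
        }

        public String name() { return name; }
        public List<Node> siblings() { return siblings; }

        public static final class Builder {
            private final String name;   // required field
            private int weight;          // optional field
            private final List<Node> siblings = new ArrayList<>();

            public Builder(String name) { this.name = name; }
            public Builder weight(int weight) { this.weight = weight; return this; }
            public Builder addSibling(Node s) { this.siblings.add(s); return this; }
            public Node build() { return new Node(this); }
        }

        public static void main(String[] args) {
            Node leaf = new Builder("leaf").weight(1).build();
            Node root = new Builder("root").weight(10).addSibling(leaf).build();
            System.out.println(root.name() + " has " + root.siblings().size() + " sibling(s)");
        }
    }
    ```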

    Read the article

  • My Javascript Object is too big

    - by Kubi
    I am creating a really big JavaScript object on page load. I get no error in Firefox, but in Internet Explorer I get an error saying: "Stop running this script? A script on this page is causing your web browser to run slowly. If it continues to run, your computer might become unresponsive." Is there any size limit for JavaScript objects in Internet Explorer? Are there any other solutions besides dividing up the object? Any help would be greatly appreciated.

    Read the article

  • Difference between C# and Java big endian bytes using miscutil

    - by Eric Hauser
    I'm using the miscutil library to communicate between a Java and a C# application using a socket. I am trying to figure out the difference between the following code (this is Groovy, but the Java result is the same): import java.io.* def baos = new ByteArrayOutputStream(); def stream = new DataOutputStream(baos); stream.writeInt(5000) baos.toByteArray().each { println it } /* outputs - 0, 0, 19, -120 */ and C#: using (var ms = new MemoryStream()) using (EndianBinaryWriter writer = new EndianBinaryWriter(EndianBitConverter.Big, ms, Encoding.UTF8)) { writer.Write(5000); ms.Position = 0; foreach (byte bb in ms.ToArray()) { Console.WriteLine(bb); } } /* outputs - 0, 0, 19, 136 */ As you can see, the last byte is -120 in the Java version and 136 in C#.
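
    Not from the original post: the two outputs describe the same bit pattern. 5000 is 0x00001388, and the fourth byte 0x88 prints as -120 because Java's byte is signed, while C#'s byte is unsigned and prints 136. A small Java check:

    ```java
    public class SignedByteDemo {
        public static void main(String[] args) {
            byte last = (byte) 0x88;                 // the fourth big-endian byte of 5000 = 0x00001388
            System.out.println(last);                // -120: Java bytes are signed
            System.out.println(last & 0xFF);         // 136: the same bits read as an unsigned value
            System.out.println(Integer.toHexString(last & 0xFF)); // "88" either way
        }
    }
    ```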

    Read the article
