Search Results

Search found 9062 results on 363 pages for 'big papoo'.


  • Predictive vs Least Connection Load Balancing Techniques

    - by Mani
    I have a Windows-based desktop application that communicates via TCP to the application servers (Windows 2003). There are no sticky sessions between client calls. We have exactly two servers to load balance, and we are thinking of using an F5 hardware NLB. The application is a heavy-load type: the services do little business logic but usually retrieve quite a large amount of data, maybe 5,000 to 10,000 records on average. It is used mainly for storing and retrieving data, with no special processing or calculations on the server side. I am favouring 'predictive', since my services sometimes take a while to return data, so tracking server feedback, as predictive does, should yield better routing. I am not sure if this information is sufficient, but given these constraints, what would be some suggestions or things to consider when choosing between Predictive and Least Connections? Thanks.


  • F5 BigIP upgrade from 9.x to 10.x

    - by mbuk2k
    I'm having a few difficulties upgrading a BIG-IP 3400 from 9.4.8 to any 10.x image. The following are the versions I've tried: 10.1.0.3341.0, 10.2.2.763.3, 10.2.3.112.0, 10.2.4.577.0. To upgrade I'm running the following command: image2disk --format=volumes BIGIP-10.1.0.3341.0.iso (obviously replacing the version number with the relevant image I'm trying to upgrade to each time). The F5 reboots and starts copying packages, but after 30 seconds or so it just stops. The cursor in the console is still flashing, but no matter how long it's left, the package doesn't copy. It freezes on a different package with each version/image (but always the same package for a given version), which I'm guessing suggests a space issue? I've checked the free space on the device and it has over 2 GB free at root, which should surely be enough. If anyone has any advice or pointers, it would be kindly appreciated. Thank you


  • UI template with big controls

    - by Bumble Bee
    I'm not sure if this question is appropriate here, but after some hard effort in Google I had no option but to post it. I'm in the process of designing some UIs for a touch screen, but I'm not sure what the template should look like, e.g. how big the buttons/text/labels should be. If anyone has experience doing similar work, please share any references you have. Thanks, BB


  • Resources (resx) maintenance in big projects

    - by exalted
    In a big project where you have lots of resources (resx), what are the right approaches and/or tools for translation that save time while keeping everything in order and leaving nothing behind? More precisely, how do you find (as far as translations are concerned) what has been modified (adds and removes are easy) across an entire .NET application from one version to the next? Would the resx technology help you there? How?

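    Since resx files are plain XML, one low-tech way to spot strings whose values changed between two versions is to diff the <data>/<value> entries directly. Below is a minimal sketch in Java; the file names and the flat resx layout it assumes are illustrative only, not part of the original question.

        import org.w3c.dom.*;
        import javax.xml.parsers.DocumentBuilderFactory;
        import java.io.File;
        import java.util.*;

        public class ResxDiff {
            // Load name -> value pairs from a .resx file (assumes the usual flat layout).
            static Map<String, String> load(String path) throws Exception {
                Map<String, String> entries = new LinkedHashMap<>();
                Document doc = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder().parse(new File(path));
                NodeList data = doc.getElementsByTagName("data");
                for (int i = 0; i < data.getLength(); i++) {
                    Element e = (Element) data.item(i);
                    NodeList value = e.getElementsByTagName("value");
                    if (value.getLength() > 0) {
                        entries.put(e.getAttribute("name"), value.item(0).getTextContent());
                    }
                }
                return entries;
            }

            public static void main(String[] args) throws Exception {
                // Placeholder file names for the previous and current version.
                Map<String, String> oldRes = load("Strings.v1.resx");
                Map<String, String> newRes = load("Strings.v2.resx");
                for (Map.Entry<String, String> e : newRes.entrySet()) {
                    String before = oldRes.get(e.getKey());
                    if (before != null && !before.equals(e.getValue())) {
                        System.out.println("MODIFIED: " + e.getKey()); // value changed, needs re-translation
                    }
                }
            }
        }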

  • Dealing with big IF statements in PHP

    - by Industrial
    Hi everyone, is there any good alternative to plain if statements in PHP? I know about switch, but I guess there's some more refined alternative out there that comes in handy when working with really big if statements. Thanks a lot,


  • I want to read a big text file

    - by Rozer
    I want to read a big text file. What I decided to do was create four threads, have each one read 25% of the file, and then join them, but it isn't much faster. Can anyone tell me whether I can use concurrent programming for this? My file structure has records with fields like name, contact, company, policyname, policynumber and uniqueno, and I want to put all the data into a HashMap at the end. Thanks

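    Splitting the file into four byte ranges rarely helps, because a single disk is usually the bottleneck and the threads just contend for it. A common alternative is to let one thread stream the lines and hand the parsing to a small pool. The sketch below assumes whitespace-separated fields with uniqueno as the last column and a made-up file name; batching several lines per task would cut overhead further.

        import java.nio.file.*;
        import java.util.concurrent.*;
        import java.util.stream.Stream;

        public class BigFileLoader {
            public static void main(String[] args) throws Exception {
                ConcurrentMap<String, String[]> records = new ConcurrentHashMap<>();
                ExecutorService pool = Executors.newFixedThreadPool(4);

                // One thread streams the file; workers only parse lines and insert records.
                try (Stream<String> lines = Files.lines(Paths.get("policies.txt"))) {
                    lines.forEach(line -> pool.submit(() -> {
                        String[] fields = line.trim().split("\\s+");
                        if (fields.length >= 6) {
                            records.put(fields[fields.length - 1], fields); // key on uniqueno
                        }
                    }));
                }
                pool.shutdown();
                pool.awaitTermination(1, TimeUnit.HOURS);
                System.out.println("Loaded " + records.size() + " records");
            }
        }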

  • How big in bytes is a DataRow

    - by JeffreyABecker
    I have a one-time process (hashing all our user passwords) written using datasets. The performance needs improvement so we've profiled the application and found that increasing the 'batch size' of the update will improve performance. I also know that if I load the entire data set into memory the application will start hitting swap and slow down. The question is: how big is a System.Data.DataRow derived class? I'd like to calculate a batch size which I know won't force the application into swap.


  • Dividing a big text into columns

    - by kalininew
    Problem: There is a big piece of the text: <div class="cont"> <p> Sed ut perspiciatis, unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam eaque ipsa, quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt, explicabo. Nemo enim ipsam voluptatem, quia voluptas sit, aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos, qui ratione voluptatem sequi nesciunt, neque porro quisquam est, qui dolorem ipsum, quia dolor sit, amet, </p> <p> consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt, ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit, qui in ea voluptate velit esse, quam nihil molestiae consequatur, vel illum, qui dolorem eum fugiat, quo voluptas nulla pariatur? At vero eos et </p> <p> accusamus et iusto odio dignissimos ducimus, qui blanditiis praesentium voluptatum deleniti atque corrupti, quos dolores et quas molestias excepturi sint, obcaecati cupiditate non provident, similique sunt in culpa, qui officia deserunt mollitia animi, id est laborum et dolorum fuga. Et harum quidem rerum facilis est et expedita distinctio. Nam libero tempore, cum soluta nobis est eligendi optio, cumque nihil impedit, quo minus id, quod maxime placeat, </p> <p> facere possimus, omnis voluptas assumenda est, omnis dolor repellendus. Temporibus autem quibusdam et aut officiis debitis aut rerum necessitatibus saepe eveniet, ut et voluptates repudiandae sint et molestiae non recusandae. Itaque earum rerum hic tenetur a sapiente delectus, ut aut reiciendis voluptatibus maiores alias consequatur aut perferendis doloribus asperiores repellat. </p> <p> Sed ut perspiciatis, unde omnis iste natus error sit voluptatem accusantium doloremque laudantium, totam rem aperiam eaque ipsa, quae ab illo inventore veritatis et quasi architecto beatae vitae dicta sunt, explicabo. Nemo enim ipsam voluptatem, quia voluptas sit, aspernatur aut odit aut fugit, sed quia consequuntur magni dolores eos, qui ratione voluptatem sequi nesciunt, neque porro quisquam est, qui dolorem ipsum, quia dolor sit, amet, </p> <p> consectetur, adipisci velit, sed quia non numquam eius modi tempora incidunt, ut labore et dolore magnam aliquam quaerat voluptatem. Ut enim ad minima veniam, quis nostrum exercitationem ullam corporis suscipit laboriosam, nisi ut aliquid ex ea commodi consequatur? Quis autem vel eum iure reprehenderit, qui in ea voluptate velit esse, quam nihil molestiae consequatur, vel illum, qui dolorem eum fugiat, quo voluptas nulla pariatur? At vero eos et </p> <p> accusamus et iusto odio dignissimos ducimus, qui blanditiis praesentium voluptatum deleniti atque corrupti, quos dolores et quas molestias excepturi sint, obcaecati cupiditate non provident, similique sunt in culpa, qui officia deserunt mollitia animi, id est laborum et dolorum fuga. Et harum quidem rerum facilis est et expedita distinctio. Nam libero tempore, cum soluta nobis est eligendi optio, cumque nihil impedit, quo minus id, quod maxime placeat, </p> <p> facere possimus, omnis voluptas assumenda est, omnis dolor repellendus. Temporibus autem quibusdam et aut officiis debitis aut rerum necessitatibus saepe eveniet, ut et voluptates repudiandae sint et molestiae non recusandae. 
Itaque earum rerum hic tenetur a sapiente delectus, ut aut reiciendis voluptatibus maiores alias consequatur aut perferendis doloribus asperiores repellat. </p> </div> What is needed: to split it into two columns. On the page it should be divided into two columns of roughly equal height. If possible, when the text container is resized, the columns should remain equal in height. Ideally it should be possible to specify a number "n" of columns to split the big piece of text into, i.e. to divide the text into any number of columns. Available: PHP, XSLT, CSS, pure JavaScript (without jQuery). Which tool is best to use, and how can this be done so that the solution is cross-browser compatible?


  • Using Image Source with big images in WPF

    - by xyzzer
    I am working on an application that allows users to manipulate multiple images using an ItemsControl. I started running some tests and found that the app has problems displaying some big images - i.e. it did not work with the high-resolution (21600x10800), 20MB images from http://earthobservatory.nasa.gov/Features/BlueMarble/BlueMarble_monthlies.php, though it displays the 6200x6200, 60MB Hubble telescope image from http://zebu.uoregon.edu/hudf/hudf.jpg just fine. The original solution just specified an Image control with a Source property pointing at a file on disk (through a binding). With the Blue Marble file the image would just not show up. Now this could be just a bug hidden somewhere deep in the funky MVVM + XAML implementation - the visual tree displayed by Snoop goes like: Window/Border/AdornerDecorator/ContentPresenter/Grid/Canvas/UserControl/Border/ContentPresenter/Grid/Grid/Grid/Grid/Border/Grid/ContentPresenter/UserControl/UserControl/Border/ContentPresenter/Grid/Grid/Grid/Grid/Viewbox/ContainerVisual/UserControl/Border/ContentPresenter/Grid/Grid/ItemsControl/Border/ItemsPresenter/Canvas/ContentPresenter/Grid/Grid/ContentPresenter/Image... Now debug this! WPF can be crazy like that... Anyway, it turned out that if I create a simple WPF application, the images load just fine. I tried finding out the root cause, but I don't want to spend weeks on it. I figured the right thing to do might be to use a converter to scale the images down - this is what I have done:

        ImagePath = @"F:\Astronomical\world.200402.3x21600x10800.jpg";
        TargetWidth = 2800;
        TargetHeight = 1866;

    and

        <Image>
          <Image.Source>
            <MultiBinding Converter="{StaticResource imageResizingConverter}">
              <MultiBinding.Bindings>
                <Binding Path="ImagePath"/>
                <Binding RelativeSource="{RelativeSource Self}" />
                <Binding Path="TargetWidth"/>
                <Binding Path="TargetHeight"/>
              </MultiBinding.Bindings>
            </MultiBinding>
          </Image.Source>
        </Image>

    and

        public class ImageResizingConverter : MarkupExtension, IMultiValueConverter
        {
            public Image TargetImage { get; set; }
            public string SourcePath { get; set; }
            public int DecodeWidth { get; set; }
            public int DecodeHeight { get; set; }

            public object Convert(object[] values, Type targetType, object parameter, CultureInfo culture)
            {
                this.SourcePath = values[0].ToString();
                this.TargetImage = (Image)values[1];
                this.DecodeWidth = (int)values[2];
                this.DecodeHeight = (int)values[3];
                return DecodeImage();
            }

            private BitmapImage DecodeImage()
            {
                BitmapImage bi = new BitmapImage();
                bi.BeginInit();
                bi.DecodePixelWidth = (int)DecodeWidth;
                bi.DecodePixelHeight = (int)DecodeHeight;
                bi.UriSource = new Uri(SourcePath);
                bi.EndInit();
                return bi;
            }

            public object[] ConvertBack(object value, Type[] targetTypes, object parameter, CultureInfo culture)
            {
                throw new Exception("The method or operation is not implemented.");
            }

            public override object ProvideValue(IServiceProvider serviceProvider)
            {
                return this;
            }
        }

    Now this works fine, except for one "little" problem. When you just specify a file path in Image.Source, the application actually uses less memory and works faster than if you use BitmapImage.DecodePixelWidth. Plus, with Image.Source, if you have multiple Image controls that point to the same image, they only use as much memory as if a single image were loaded. With the BitmapImage.DecodePixelWidth solution, each additional Image control uses more memory, and each of them uses more than when just specifying Image.Source.
    Perhaps WPF somehow caches these images in compressed form, while if you specify the decode dimensions it feels like you get an uncompressed image in memory, plus it takes six times as long (perhaps without it the scaling is done on the GPU?), plus it feels like the original high-resolution image also gets loaded and takes up space. If I just scale the image down, save it to a temporary file and then use Image.Source to point at that file, it will probably work, but it will be pretty slow and it will require handling cleanup of the temporary file. If I could detect an image that does not get loaded properly, maybe I could scale it down only when I need to, but Image.ImageFailed never gets triggered. Maybe it has something to do with video memory and this app simply using more of it with the deep visual tree, opacity masks, etc. The actual question: how can I load big images as quickly as the plain Image.Source option does, without using more memory for additional copies, and without extra memory for the scaled-down version when I only need the image at a resolution lower than the original? Also, I don't want to keep them in memory once no Image control is using them anymore.


  • ReadDirectoryChangesW not working with big files

    - by sanky
    I was working with ReadDirectoryChangesW until I encountered some unwanted behaviour. Below is a short reference to the way I am using ReadDirectoryChangesW. The behaviour is that when a big file (several GB) is copied into the watched directory, I get the notification immediately and start my operation according to the notification I receive, but the file I/O (the copy operation) is still pending, and that results in an unexpected state. Can anyone please suggest a way to synchronize this? I want the copy operation (the one that gave me the notification) to complete first, and only then do my processing.

        while (ReadDirectoryChangesW(hDir, pBuffer, nBufSize, FALSE,
                                     FILE_NOTIFY_CHANGE_FILE_NAME,
                                     &BytesReturned, NULL, NULL))
        {
            pBufferCurrent = pBuffer;
            while (pBufferCurrent)
            {
                switch (pBufferCurrent->Action)
                {
                case FILE_ACTION_ADDED:
                    break;
                default:
                    break;
                }
                if (pBufferCurrent->NextEntryOffset)
                    pBufferCurrent = (FILE_NOTIFY_INFORMATION*)(((oef_u8_t*)pBufferCurrent) + pBufferCurrent->NextEntryOffset);
                else
                    pBufferCurrent = NULL;
            }
        }

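    The notification only says a file appeared, not that the copy finished, so the usual workaround is to defer processing until the file stops growing (or can be opened exclusively). The idea is language-agnostic; here is a minimal sketch of the size-polling variant, written in Java rather than Win32 C++, with the poll interval and the "two identical samples" rule being assumptions:

        import java.nio.file.Files;
        import java.nio.file.Path;

        public class WaitForCompleteCopy {
            // Block until the file's size stays unchanged across two consecutive samples.
            public static void waitUntilStable(Path file) throws Exception {
                long previous = -1;
                while (true) {
                    long current = Files.size(file);
                    if (current > 0 && current == previous) {
                        return; // size stopped growing: the copy is (very likely) finished
                    }
                    previous = current;
                    Thread.sleep(1000); // poll interval is an arbitrary choice
                }
            }
        }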

  • Python class design - Splitting up big classes into multiple ones to group functionality

    - by Ivo Wetzel
    OK, I've got two really big classes, about 1k lines each, that I currently have split up into multiple ones; they then get recombined using multiple inheritance. Now I'm wondering whether there is any cleaner/better, more Pythonic way of doing this. Completely factoring them out would result in endless amounts of self.otherself.do_something calls, which I don't think is the way it should be done. To make things clear, here's what it currently looks like:

        from gui_events import GUIEvents    # event handlers
        from gui_helpers import GUIHelpers  # helper methods that don't directly modify the GUI

        # GUI.py
        class GUI(gtk.Window, GUIEvents, GUIHelpers):
            # general stuff here

    One problem that results from this is Pylint complaining, giving me trillions of "__init__ not called" / "undefined attribute" / "attribute accessed before definition" warnings.


  • MIPS assembly: big and little endian confusion

    - by Barney
    I've run the following code snippet on the MIPS MARS simulator. That simulator is little endian, so the results are as follows:

        lui $t0,0x1DE       # $t0 = 0x01DE0000
        ori $t0,$t0,0xCADE  # $t0 = 0x01DECADE
        lui $t1,0x1001      # $t1 = 0x10010000
        sw  $t0,200($t1)    # $t1 + 200 bytes = 0x01DECADE
        lw  $t2,200($t1)    # $t2 = 0x01DECADE

    So on a little-endian MIPS simulator, the value of $t2 at the end of the program is 0x01DECADE. If this simulator were big endian, what would the value be? Would it be 0xDECADE01, or would it still be 0x01DECADE?

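    Not a MIPS answer as such, but the store/load round trip the question describes can be mimicked with java.nio.ByteBuffer to see what each byte order does to the in-memory layout (a small sketch, unrelated to MARS itself):

        import java.nio.ByteBuffer;
        import java.nio.ByteOrder;

        public class EndianDemo {
            public static void main(String[] args) {
                for (ByteOrder order : new ByteOrder[] { ByteOrder.LITTLE_ENDIAN, ByteOrder.BIG_ENDIAN }) {
                    ByteBuffer buf = ByteBuffer.allocate(4).order(order);
                    buf.putInt(0x01DECADE);                  // like "sw": write the word to memory
                    System.out.printf("%s memory bytes:", order);
                    for (byte b : buf.array()) {
                        System.out.printf(" %02X", b & 0xFF); // DE CA DE 01 vs 01 DE CA DE
                    }
                    // like "lw": read the word back using the same byte order
                    System.out.printf(" -> read back as 0x%08X%n", buf.getInt(0));
                }
            }
        }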

  • How to geocode a big number of addresses?

    - by user308569
    I need to geocode, i.e. translate street addresses to latitude/longitude, for ~8,000 street addresses. I am using both the Yahoo and Google geocoding engines at http://www.gpsvisualizer.com/geocoder/, and found that for a big number of addresses those engines (one of them or both) either could not perform the geocoding (i.e. they return latitude=0, longitude=0) or return wrong coordinates (including cases where Yahoo and Google give different results). What is the best way to handle this problem? Which engine is (usually) more accurate? I would appreciate any thoughts, suggestions or ideas from people who have previous experience with this kind of task.

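    One language-agnostic way to triage a batch like this is to run every address through both engines, accept the results that roughly agree, and queue the failures and disagreements for manual review. A minimal sketch follows; GeoPoint, yahooGeocode and googleGeocode are hypothetical placeholders for whatever client code actually calls the two services, and the ~100 m tolerance is an assumption:

        import java.util.*;

        public class GeocodeTriage {
            record GeoPoint(double lat, double lon) {}

            // A null result or (0, 0) is treated as "the engine failed on this address".
            static boolean failed(GeoPoint p) {
                return p == null || (p.lat() == 0.0 && p.lon() == 0.0);
            }

            // ~0.001 degrees is on the order of 100 m; the tolerance is an assumption.
            static boolean roughlyEqual(GeoPoint a, GeoPoint b) {
                return Math.abs(a.lat() - b.lat()) < 0.001 && Math.abs(a.lon() - b.lon()) < 0.001;
            }

            static void triage(List<String> addresses) {
                Map<String, GeoPoint> accepted = new LinkedHashMap<>();
                List<String> needsReview = new ArrayList<>();
                for (String address : addresses) {
                    GeoPoint y = yahooGeocode(address);   // placeholder call
                    GeoPoint g = googleGeocode(address);  // placeholder call
                    if (!failed(y) && !failed(g) && roughlyEqual(y, g)) {
                        accepted.put(address, g);         // the engines agree: accept
                    } else {
                        needsReview.add(address);         // failure or disagreement: review by hand
                    }
                }
                System.out.println(accepted.size() + " accepted, " + needsReview.size() + " need review");
            }

            // Placeholders only; real implementations would call the respective services.
            static GeoPoint yahooGeocode(String address) { return null; }
            static GeoPoint googleGeocode(String address) { return null; }
        }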

  • Building big, immutable objects without constructors having long parameter lists

    - by Malax
    Hi StackOverflow! I have some big (more than three fields) objects which can and should be immutable. Every time I run into that case I tend to create constructor abominations with long parameter lists. It doesn't feel right, it is hard to use, and readability suffers. It is even worse if the fields are some sort of collection type like lists. A simple addSibling(S s) would ease the object creation so much, but renders the object mutable. What do you guys use in such cases? I'm on Scala and Java, but I think the problem is language agnostic as long as the language is object oriented. Solutions I can think of: "constructor abominations with long parameter lists", or the Builder pattern. Thanks for your input!

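    A minimal sketch of the Builder approach for exactly this case: the builder is mutable and carries the convenient addSibling-style method, while the finished object stays immutable. The class and field names below are made up for illustration:

        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;

        public final class Person {
            private final String name;
            private final int age;
            private final List<Person> siblings;

            private Person(Builder b) {
                this.name = b.name;
                this.age = b.age;
                // Defensive copy + unmodifiable view keeps the finished object immutable.
                this.siblings = Collections.unmodifiableList(new ArrayList<>(b.siblings));
            }

            public String name() { return name; }
            public int age() { return age; }
            public List<Person> siblings() { return siblings; }

            public static final class Builder {
                private String name;
                private int age;
                private final List<Person> siblings = new ArrayList<>();

                public Builder name(String name) { this.name = name; return this; }
                public Builder age(int age) { this.age = age; return this; }
                public Builder addSibling(Person s) { this.siblings.add(s); return this; }

                public Person build() { return new Person(this); }
            }
        }

        // Usage: Person p = new Person.Builder().name("Ada").age(36).addSibling(brother).build();

    In Scala, case classes with named and default arguments (and copy) give much of the same convenience without a hand-written builder.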
