Search Results

Search found 39456 results on 1579 pages for 'why'.


  • Why is the membership provider not generic?

    - by Timmy O' Tool
    I have to confess that I hate the membership provider. The default implementation is rarely appropriate, and I have yet to see a good implementation of a custom membership provider, probably because it isn't possible :-) So the question is: in your opinion, what are the reasons for not making the membership/role provider a generic class? In other words, why didn't Microsoft choose this approach?
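
    To make the contrast concrete, here is the real (non-generic) shape next to a purely hypothetical generic one; the generic class does not exist anywhere, it is only what I mean by "generic":

      // Actual API (System.Web.Security), abbreviated:
      public abstract class MembershipProvider : ProviderBase
      {
          public abstract MembershipUser GetUser(string username, bool userIsOnline);
          public abstract bool ValidateUser(string username, string password);
          // ... many more members, all typed to MembershipUser
      }

      // Hypothetical generic variant (illustrative only):
      public abstract class MembershipProvider<TUser, TKey> : ProviderBase
          where TUser : class
      {
          public abstract TUser GetUser(TKey key, bool userIsOnline);
          public abstract bool ValidateUser(string username, string password);
      }

    One mechanical constraint worth noting (my reading, not an official answer): providers are instantiated by type name from web.config through non-generic plumbing built on ProviderBase, which has no natural place to supply type arguments.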

  • Why is the class of a subclass reported as the superclass?

    - by Raj
      class Object
        def singleton_class
          class << self
            self
          end
        end
      end

      class Human
        proc = lambda { puts 'proc says my class is ' + self.name.to_s }
        singleton_class.instance_eval do
          define_method(:lab) do
            proc.call
          end
        end
      end

      class Developer < Human
      end

      Human.lab      # class is Human
      Developer.lab  # class is Human ; oops

    Why is Developer.lab reporting that it is Human?
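
    For what it's worth, here is a sketch (my own, with made-up names) of the rule at work: a lambda captures the self of the scope where it is written, while a block given to define_method is re-evaluated with self bound to the receiver:

      class Human2
        # self here is Human2; the lambda captures it lexically, forever
        CAPTURED = lambda { self.name }

        # define_method re-binds self to whoever the method is called on
        # (Object#singleton_class is built in since Ruby 1.9.2)
        singleton_class.send(:define_method, :lab2) { name }
      end

      class Developer2 < Human2; end

      puts Human2::CAPTURED.call  # => Human2
      puts Developer2.lab2        # => Developer2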

  • Why use Apache over NGINX/Cherokee/Lighttpd?

    - by codysoyland
    Apache has been the de facto standard web server for over a decade, but recent years have brought us web servers that consume less RAM and handle many more requests per second using fewer threads and asynchronous I/O. I also find the configuration of these servers to be more straightforward and minimal. Why do people use Apache when asynchronous servers are so much more lightweight? Is there any clear benefit?

  • Do you use an exception class in your Perl programs? Why or why not?

    - by daotoad
    I've got a bunch of questions about how people use exceptions in Perl. I've included some background notes on exceptions; skip those if you want, but please take a moment to read the questions and respond to them. Thanks.

    Background on Perl Exceptions

    Perl has a very basic built-in exception system that provides a spring-board for more sophisticated usage. For example,

      die "I ate a bug.\n";

    throws an exception with a string assigned to $@. You can also throw an object instead of a string:

      die BadBug->new('I ate a bug.');

    You can even install a signal handler to catch the __DIE__ pseudo-signal. Here's a handler that rethrows exceptions as objects if they aren't already:

      $SIG{__DIE__} = sub {
          my $e = shift;
          $e = ExceptionObject->new( $e ) unless blessed $e;
          die $e;
      };

    This pattern is used in a number of CPAN modules. But perlvar says:

      Due to an implementation glitch, the $SIG{__DIE__} hook is called even inside an eval(). Do not use this to rewrite a pending exception in $@, or as a bizarre substitute for overriding CORE::GLOBAL::die(). This strange action at a distance may be fixed in a future release so that $SIG{__DIE__} is only called if your program is about to exit, as was the original intent. Any other use is deprecated.

    So now I wonder if objectifying exceptions in a __DIE__ handler is evil.

    The Questions

    1. Do you use exception objects? If so, which one, and why? If not, why not?
    2. If you don't use exception objects, what would entice you to use them?
    3. If you do use exception objects, what do you hate about them, and what could be better?
    4. Is objectifying exceptions in the __DIE__ handler a bad idea?
    5. Where should I objectify my exceptions? In my eval{} wrapper? In a $SIG{__DIE__} handler?
    6. Are there any papers, articles, or other resources on exceptions in general, and in Perl, that you find useful or enlightening?
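
    For anyone who wants a self-contained example of the object-throwing idiom, here is a sketch (BadBug is a toy class I made up):

      use strict;
      use warnings;
      use Scalar::Util qw(blessed);

      package BadBug;
      sub new     { my ($class, $msg) = @_; return bless { msg => $msg }, $class }
      sub message { return $_[0]{msg} }

      package main;

      eval {
          die BadBug->new('I ate a bug.');
      };
      if ( my $e = $@ ) {
          if ( blessed($e) && $e->isa('BadBug') ) {
              warn 'caught: ' . $e->message . "\n";
          }
          else {
              die $e;    # not ours; rethrow
          }
      }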

  • Why does /**[newline] not always insert the Javadoc template including @param and @return in Eclipse?

    - by Bas van den Broek
    I'm documenting code in Eclipse and have been using /** followed by Enter a lot to insert the Javadoc template. However, for some reason this does not always work: it will create the template for writing comments, but it won't automatically insert the @param and @return tags. If I copy the exact same method to another class, it will insert the full template. It would be a big help if anyone could tell me why it won't do this in some situations.
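
    For reference, the full template it normally generates looks something like this (the method itself is just an example; Eclipse leaves the descriptions blank):

      /**
       * @param a
       * @param b
       * @return
       */
      public int add(int a, int b) {
          return a + b;
      }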

  • Why can enabling transparency lead to clipping problems?

    - by Amokrane
    Hi, I'm working on a 3D graphical application in Java using the Java 3D API. I noticed that every time I deal with transparency, I get clipping problems in return: some parts of the scene aren't displayed properly. It might seem obvious that this would happen in some way, but I'm looking for a logical explanation: why is this happening? Thank you.
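
    For concreteness, this is roughly how I enable transparency; a sketch, assuming a Shape3D whose Appearance may be replaced:

      import javax.media.j3d.Appearance;
      import javax.media.j3d.Shape3D;
      import javax.media.j3d.TransparencyAttributes;

      final class TransparencyHelper {
          // Give a shape 50% blended (alpha) transparency.
          static void makeTranslucent(Shape3D shape) {
              Appearance ap = shape.getAppearance();
              if (ap == null) {
                  ap = new Appearance();
              }
              ap.setTransparencyAttributes(new TransparencyAttributes(
                  TransparencyAttributes.BLENDED, 0.5f));
              shape.setAppearance(ap);
          }
      }

    As I understand it, blended geometry is drawn after the opaque geometry and may be depth-sorted separately, which is why artifacts that look like clipping tend to show up; but that is exactly the part I'd like explained properly.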

  • Why do C/C++ have memory issues?

    - by LinuxNewbie
    I have read lots of programmers saying and writing that when programming in C/C++ there are lots of issues related to memory. I am planning to learn to program in C/C++. I have beginner knowledge of C/C++ and I want to see some short samples of why C/C++ can have issues with memory management. Please provide some samples.
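
    A minimal sketch of the three classic problems (illustrative only; the dangerous lines are commented out because each one is undefined behavior):

      #include <stdlib.h>
      #include <string.h>

      int main(void) {
          /* 1. Memory leak: allocated, used, never freed. */
          char *leak = malloc(100);
          strcpy(leak, "this allocation is never freed");

          /* 2. Dangling pointer: the memory is gone, the pointer is not.
             Uncommenting the write below is undefined behavior. */
          char *dangling = malloc(100);
          free(dangling);
          /* dangling[0] = 'x'; */

          /* 3. Buffer overflow: writing past the end of an allocation.
             Uncommenting the strcpy below is undefined behavior. */
          char small[4];
          /* strcpy(small, "far too long for four bytes"); */

          (void)small;
          return 0;
      }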

  • Why is the Serializable attribute required for an object to be serialized?

    - by Keivan
    Based on my understanding, SerializableAttribute provides no compile-time checks, as it's all done at runtime. If that's the case, then why is it required for classes to be marked as serializable? Couldn't the serializer just try to serialize an object and then fail? Isn't that what it does right now when something is marked: it tries and fails? Wouldn't it be better if you had to mark things as unserializable rather than serializable? That way you wouldn't have the problem of libraries not marking things as serializable.
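
    A minimal sketch of the runtime behavior in question (using BinaryFormatter; the type names are mine):

      using System;
      using System.IO;
      using System.Runtime.Serialization.Formatters.Binary;

      [Serializable]
      class Marked { public int Value; }

      class Unmarked { public int Value; }

      class Program
      {
          static void Main()
          {
              var formatter = new BinaryFormatter();
              using (var stream = new MemoryStream())
              {
                  formatter.Serialize(stream, new Marked());      // works
                  // The next line compiles fine but throws
                  // SerializationException at runtime:
                  // formatter.Serialize(stream, new Unmarked());
              }
          }
      }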

  • Why does Apple create its views this way?

    - by John Smith
    In the hope of fixing a bug of mine from another post, I would like to know why Apple writes this (for its Elements example):

      UIView *localContainerView = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]];
      self.containerView = localContainerView;
      [localContainerView release];

    instead of the simpler:

      containerView = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]];
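
    For context: assuming containerView is declared as a retain property (as it is in Apple's sample), the ownership works out like this (annotations are mine):

      // @property (nonatomic, retain) UIView *containerView;

      UIView *localContainerView =
          [[UIView alloc] initWithFrame:[[UIScreen mainScreen] applicationFrame]];
      // alloc: this code owns one reference

      self.containerView = localContainerView;
      // the retain setter takes its own reference; two owners now

      [localContainerView release];
      // give up the local reference; the property's reference keeps the view alive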

  • Why use buffers to read/write Streams

    - by James Hay
    After reading various questions on reading and writing Streams, all the various answers define something like this as the correct way to do it:

      private void CopyStream(Stream input, Stream output)
      {
          byte[] buffer = new byte[16 * 1024];
          int read;
          while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
          {
              output.Write(buffer, 0, read);
          }
      }

    Two questions: Why read and write in these smaller chunks? What is the significance of the buffer size used?
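
    A typical call site, for context (the file names are placeholders):

      using (var input = File.OpenRead("source.bin"))
      using (var output = File.Create("copy.bin"))
      {
          CopyStream(input, output);
      }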

  • Math.min.apply(0, x) - why?

    - by Trevor Burnham
    I was just digging through some JavaScript code (Raphaël.js) and came across the following line (translated slightly):

      Math.min.apply(0, x)

    where x is an array. Why on earth would you do this? The behavior seems to be "take the min from the array x."
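
    A quick illustration of what apply is doing here (the array contents are made up):

      var x = [5, 1, 3];

      Math.min.apply(0, x);    // 1: apply spreads x into individual arguments
      Math.min.apply(null, x); // 1: the first argument is only the `this` value,
                               //    and Math.min ignores its `this`
      Math.min(5, 1, 3);       // 1: the equivalent direct call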

  • Why do we have so many programming languages?

    - by ntsbjctve
    Most people would probably answer with "You won't build a house using only a hammer", but my argument against this is: there is also only one real mathematical language, used for everything from chemical to architectural calculations, and since programming languages are in many ways similar to maths, why should it be so different with them?

  • Why can't we create an object of an abstract class?

    - by Gaurav
    Here is a scenario that's on my mind. I have Googled and Binged it a lot, but I only got answers like "an abstract class has unimplemented methods, so we can't create an object of it" or "the word abstract instructs the CLR not to create an object of the class". But in a simple class where all the methods are virtual, we are able to create an object? Also, we can give an abstract class's constructor different access modifiers: private, protected, or public. My search terminated at this question: why can't we create an object of an abstract class?
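
    A minimal sketch of the two cases being contrasted (type names are mine):

      abstract class Animal
      {
          protected Animal() { }           // an abstract class may have constructors...
          public virtual void Speak() { }  // ...and implementations for every member
      }

      class Dog : Animal
      {
          public override void Speak() { }
      }

      class Program
      {
          static void Main()
          {
              // Animal a = new Animal();  // error CS0144: cannot create an instance
              //                           // of the abstract class 'Animal'
              Animal pet = new Dog();      // fine; Animal's constructor still runs
          }
      }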

  • Why doesn't TextBlock databinding call ToString() on a property whose compile-time type is an interface?

    - by Jay
    This started with weird behaviour that I thought was tied to my implementation of ToString(), and I asked this question: http://stackoverflow.com/questions/2916068/why-wont-wpf-databindings-show-text-when-tostring-has-a-collaborating-object

    It turns out to have nothing to do with collaborators and is reproducible. When I bind Label.Content to a property of the DataContext that is declared as an interface type, ToString() is called on the runtime object and the label displays the result. When I bind TextBlock.Text to the same property, ToString() is never called and nothing is displayed. But if I change the declared property to a concrete implementation of the interface, it works as expected. Is this somehow by design? If so, any idea why?

    To reproduce, create a new WPF Application (.NET 3.5 SP1) and add the following classes:

      public interface IFoo
      {
          string foo_part1 { get; set; }
          string foo_part2 { get; set; }
      }

      public class Foo : IFoo
      {
          public string foo_part1 { get; set; }
          public string foo_part2 { get; set; }

          public override string ToString()
          {
              return foo_part1 + " - " + foo_part2;
          }
      }

      public class Bar
      {
          public IFoo foo
          {
              get { return new Foo { foo_part1 = "first", foo_part2 = "second" }; }
          }
      }

    Set the XAML of Window1 to:

      <Window x:Class="WpfApplication1.Window1"
              xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
              xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
              Title="Window1" Height="300" Width="300">
          <StackPanel>
              <Label Content="{Binding foo, Mode=Default}"/>
              <TextBlock Text="{Binding foo, Mode=Default}"/>
          </StackPanel>
      </Window>

    And in Window1.xaml.cs:

      public partial class Window1 : Window
      {
          public Window1()
          {
              InitializeComponent();
              DataContext = new Bar();
          }
      }

    When you run this application, you'll see the text only once (at the top, in the label). If you change the type of the foo property on the Bar class to Foo (instead of IFoo) and run again, you'll see the text in both controls.

  • Why does Windows Live Spaces fetch images through HTTPS?

    - by Morgan Cheng
    I happened to find that when a Live Spaces page is loaded, inline images are fetched over HTTPS instead of HTTP. This doesn't make sense to me. The text part of Live Spaces is not fetched over HTTPS, so why are the images? I'd bet that fetching images over HTTPS just makes the page load slower. Is there any special advantage to choosing HTTPS over HTTP in this case?

  • Why is the border of a <tr> not showing in IE?

    - by metal-gear-solid
    Why is the border on tfoot tr:first-child not showing in IE? I'm checking in IE7. The font-weight:bold and background:yellow are applied in IE, but the border is not.

      table {
          border-collapse: collapse;
          border-spacing: 0;
      }

      table tfoot tr:first-child {
          font-weight: bold;
          background: yellow;
          border-top: 2px solid red;
          border-bottom: 2px solid red;
      }
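
    A minimal table to test against; the markup is my assumption, inferred from the selectors:

      <table>
        <thead>
          <tr><th>Item</th><th>Cost</th></tr>
        </thead>
        <tfoot>
          <tr><td>Total</td><td>10</td></tr>
        </tfoot>
        <tbody>
          <tr><td>Widget</td><td>10</td></tr>
        </tbody>
      </table>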
