Search Results

Search found 115 results on 5 pages for 'variance'.


  • 1000 Hz Linux kernel necessary if I have tickless and high resolution timers?

    - by Bob
    I am trying to improve performance on my server. I have a few processes that need low jitter (less than 10 ms variance). I have a maximum load average of 4 on an i7-920 (4 physical cores, 8 with HT). There are about 10 processes using between 40% and 90% of a core in user mode. System usage is 3% total. Total CPU usage is 80% max. Will setting the kernel from 100 Hz to 1000 Hz improve the jitter if tickless and high-resolution timers are already set? This page seems to indicate it still does something: https://lkml.org/lkml/2009/4/28/401 How about changing from voluntary (PREEMPT_VOLUNTARY) to preemptible (PREEMPT)?
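    Before rebuilding, it can help to confirm what the running kernel was actually built with. A minimal Python sketch (assuming a distribution that ships its config under /boot, with /proc/config.gz as a fallback) that reports the timer- and preemption-related options:

        import gzip
        import os
        import platform

        def read_kernel_config():
            # Most distributions install the build config next to the kernel image.
            plain = "/boot/config-" + platform.release()
            if os.path.exists(plain):
                with open(plain) as f:
                    return f.read()
            # Fallback: kernels built with CONFIG_IKCONFIG_PROC expose it here.
            if os.path.exists("/proc/config.gz"):
                with gzip.open("/proc/config.gz", "rt") as f:
                    return f.read()
            return ""

        config = read_kernel_config().splitlines()
        for opt in ("CONFIG_HZ", "CONFIG_NO_HZ", "CONFIG_HIGH_RES_TIMERS",
                    "CONFIG_PREEMPT_VOLUNTARY", "CONFIG_PREEMPT"):
            matches = [line for line in config if line.startswith(opt + "=")]
            print(opt, "->", matches[0] if matches else "not set")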

    Read the article

  • Looking for Unix tool/script that, given an input path, will compress every batch of uncompressed 100MB text files into a single gzip file

    - by newToFlume
    I have a dump of thousands of small text files (1-5 MB each), each containing lines of text. I need to "batch" them up, so that each batch is of a fixed size - say 100 MB - and compress that batch. Each batch could be: A single file that is just a 'cat' of the contents of the individual text files, or Just the individual text files themselves. Caveats: unix split -b will not work here as I need to keep lines of text intact. Using the lines option is a bit complicated as there is a large variance in the number of bytes in each line. The files need not be a fixed size strictly, as long as they are within 5% of the requested size. The lines are critical and should not be lost: I need to confirm that the input made its way to the output without loss - what rolling checksum (something like CRC32, but better/"stronger" in the face of collisions) should I use? A script should do nicely, but this seems like a task someone has done before, and it would be nice to see some code (preferably Python or Ruby) that does at least something similar.
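    A minimal Python sketch of the batching idea (not a hardened tool; the input layout and output names are assumptions): group whole files until a batch reaches roughly 100 MB, gzip the concatenation, and record a SHA-256 over the uncompressed bytes of each batch so the round trip can be verified. Because only whole files are grouped, lines are never split.

        import glob
        import gzip
        import hashlib
        import os

        BATCH_LIMIT = 100 * 1024 * 1024  # ~100 MB of uncompressed text per batch

        def write_batch(paths, index, out_dir="batches"):
            # Concatenate whole files into one gzip, hashing the uncompressed stream.
            os.makedirs(out_dir, exist_ok=True)
            digest = hashlib.sha256()
            out_path = os.path.join(out_dir, "batch_%04d.gz" % index)
            with gzip.open(out_path, "wb") as out:
                for p in paths:
                    with open(p, "rb") as f:
                        data = f.read()  # input files are small (1-5 MB), so read whole
                        digest.update(data)
                        out.write(data)
            return out_path, digest.hexdigest()

        batch, size, index = [], 0, 0
        for path in sorted(glob.glob("input/*.txt")):  # assumed input location
            batch.append(path)
            size += os.path.getsize(path)
            if size >= BATCH_LIMIT:  # whole files only, so no line is ever split
                print(write_batch(batch, index))
                batch, size, index = [], 0, index + 1
        if batch:
            print(write_batch(batch, index))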

    Read the article

  • Do game-theoretic considerations stand in the way of this market-based game-mechanic achieving its goals?

    - by BerndBrot
    Mechanic: The mechanic is called "market manipulation" and is supposed to work like this: Players can enter the London Stock Exchange (LSE). The LSE displays the stock prices of 8 to 10 companies and derivatives; this number is kept relatively small to ensure that players will collide in their efforts to manipulate the market in their favor. The prices are calculated based on the real-world prices of these companies and derivatives (in real time), any market manipulations that were conducted by the players, and any market corrections by the system. Players can buy and sell shares with cash, a resource in the game, at the current in-game market value. Players can manipulate the market, i.e. make the price of a share rise or fall by some amount over a certain period of time. Manipulating the market requires spending certain in-game resources and is therefore limited. The system continuously corrects market manipulations by letting the in-game prices converge towards their real-world counterparts at a rate of 2% of the difference between the two per hour. Because of this market correction mechanism, pushing prices up (or down) becomes increasingly difficult the higher (or lower) the price already is.
    Goals: Players are supposed to collide (and have incentives to collide) in their efforts to manipulate the market in their favor, especially when it comes to manipulation efforts by different groups. Prices should not settle around any equilibrium points; the more variance the better. Band-wagoning should always involve risk (recognizing that prices have started rising should not be a sure sign that they will keep rising, so that everybody can make easy profits even when they don't manipulate the market themselves).
    Question: Are there any game-theoretic considerations that prevent the mechanic from achieving these goals?
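    For concreteness, a tiny sketch of the stated correction rule (all numbers below are made up): each hour the in-game price moves 2% of the way toward its real-world counterpart, so the gap shrinks by a factor of 0.98 per hour and the pull back grows with the size of the manipulation.

        # Illustrative only: the 2%-per-hour convergence rule described above.
        def correct(in_game_price, real_world_price, rate=0.02):
            return in_game_price + rate * (real_world_price - in_game_price)

        price, real = 150.0, 100.0  # a price pushed up by players vs. its real counterpart
        for hour in range(1, 6):
            price = correct(price, real)
            print("hour %d: %.2f" % (hour, price))  # 149.00, 148.02, 147.06, ...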

    Read the article

  • C# 4.0: Covariance And Contravariance In Generics Made Easy

    - by Paulo Morgado
    In my last post, I went through what variance is in .NET 4.0 and C# 4.0 in a rather theoretical way. Now, I'm going to try to make it a bit more down to earth. Given:
        class Base { }
        class Derived : Base { }
    Such that:
        Trace.Assert(typeof(Base).IsClass && typeof(Derived).IsClass && typeof(Base).IsGreaterOrEqualTo(typeof(Derived)));
    Covariance:
        interface ICovariantIn<out T> { }
        Trace.Assert(typeof(ICovariantIn<Base>).IsGreaterOrEqualTo(typeof(ICovariantIn<Derived>)));
    Contravariance:
        interface IContravariantIn<in T> { }
        Trace.Assert(typeof(IContravariantIn<Derived>).IsGreaterOrEqualTo(typeof(IContravariantIn<Base>)));
    Invariance:
        interface IInvariantIn<T> { }
        Trace.Assert(!typeof(IInvariantIn<Base>).IsGreaterOrEqualTo(typeof(IInvariantIn<Derived>)) && !typeof(IInvariantIn<Derived>).IsGreaterOrEqualTo(typeof(IInvariantIn<Base>)));
    Where:
        public static class TypeExtensions
        {
            public static bool IsGreaterOrEqualTo(this Type self, Type other)
            {
                return self.IsAssignableFrom(other);
            }
        }

    Read the article

  • DevDays ‘00 The Netherlands day #2

    - by erwin21
    Day 2 of DevDays 2010, and again 5 interesting sessions at the World Forum in The Hague. The first session of the day, in the big World Forum theater, was from Scott Hanselman, who gave a lap around .NET 4.0. In his way of presenting he talked about all kinds of new features of .NET 4.0, like MEF, threading, parallel processing, changes and additions to the CLR and DLR, WPF, and all the new language features of .NET 4.0. After a small break it was time for session 2, from Scott Allen, about Tips, Tricks and Optimizations of LINQ. He talked about lazy and deferred execution, the difference between IQueryable and IEnumerable, and the two flavors of LINQ syntax. The lunch was again very well prepared and delicious, but after that it was time for session 3, Web Vulnerabilities and Exploits, from Alex Thissen. This was no normal session but more like a workshop; we decided which subjects we discussed, and the subjects were OWASP, XSS and other injections, validation, and encoding. He gave some handy tips and tricks on how to prevent such attacks. Session 4 was about the new features of C# 4.0, from Alex van Beek. He talked about optional and named parameters, generic co- and contra-variance, the dynamic keyword and COM interop features. He showed how to use them but also when not to use them. The last session of today, and also the last session of DevDays 2010, was about WCF Best Practices from Gerben van Loon. He talked about 7 best practices that you must know when you are going to use WCF. With some quick demos he showed the problem and the solution for some common issues. They were two interesting days, and next year I will surely be attending again.

    Read the article

  • Enterprise Architecture IS (should not be) Arbitrary

    - by pat.shepherd
    I took a look at a blog entry today by Jordan Braunstein where he comments on another blog entry titled "Yes, Enterprise Architecture is Relative BUT it is not Arbitrary."  The blog makes some good points, such as the following: Lock 10 architects in 10 separate rooms; provide them all an identical copy of the same business, technical, process, and system requirements; have them design an architecture under the same rules and perspectives; and I guarantee your result will be 10 different architectures of varying degrees. SOA Today: Enterprise Architecture IS Arbitrary Agreed, to a degree - but less so if all 10 truly followed one of the widely accepted EA frameworks. My thinking is that EA frameworks all focus on getting the business goals/vision locked down first as the primary drivers for decisions made lower down the architecture stack.  Many people I talk to know about frameworks such as TOGAF, FEA, etc., but seldom apply their tenets to the architecture at hand.  We all seem to want to get right into the Visio diagrams and boxes and arrows and connecting protocols and implementation details and lions and tigers and bears (Oh, my!) too early. If done properly, the Business, Application and Information architectures are nailed down BEFORE any technological direction (SOA or otherwise) is set.  Those 3 layers and Governance (people and processes), IMHO, are layers that should not vary much, as they have everything to do with understanding the business -- from which technological conclusions can later be drawn. I really like what he went on to say later in the post about the fact that architecture attempts to reduce the amount of variance between the 10 different architects' work.  That is the real heart of what EA is about: REMOVING THE ARBITRARINESS.

    Read the article

  • How do you deal with translating theory into practice?

    - by Mr. Shickadance
    Hello all! Being a computer scientist in a research field, I am often tasked with working alongside professionals outside of the software domain (think mathematicians, electrical engineers, etc.), and then translating their theories and ideas into real-world implementations. I often find it difficult when they present a theoretical problem which appears to be somewhat disconnected from reality. I am not saying that the theory is bogus, only that it is difficult to translate into real-world situations. For example, recently I have been working with software-defined radios. We are exploring many different areas, but often the math specialists in my group will present a problem which is heavily grounded in theory (signal processing, physics, whatever). I often struggle at times when it is hard to draw direct parallels between the theory and the real-world implementation that I need to develop. Say we are working on an energy detector; the theory person in my group would say "you need to measure the noise variance with no signal present." This leads me to think "how the hell do I isolate noise from a signal in reality?" There are many examples, but I hope you see where I am going. So, my question is: how does one deal with implementation of theoretical concepts when the theory seems detached from reality? Or at least when the connections are not so clear. Or perhaps when the person with the 'theory' is ignorant of real restrictions? Note: I found this a hard question to ask - hopefully you are following me. If you have suggestions on how I could improve it, by all means let me know! Thanks for looking! EDIT: To be a bit more clear, I understand that in situations like this I must learn that specific domain myself to an extent (i.e. signal processing), but I am more concerned with cases where those theoretical concepts do not appear to be as grounded in practice as one would like.
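    To make the energy-detector example concrete, here is a rough numpy sketch of the "noise variance with no signal present" step, with synthetic samples standing in for a real SDR capture (the capture API is whatever the radio stack provides, and the 4-sigma threshold factor is an arbitrary illustration):

        import numpy as np

        rng = np.random.default_rng(0)
        # Stand-in for a capture taken while nothing is transmitting.
        noise_only = rng.normal(0.0, 0.5, size=100_000)
        noise_var = np.var(noise_only)  # the quantity the theory asks for

        # A basic energy detector: compare windowed energy against a threshold
        # derived from the measured noise variance.
        samples = np.concatenate([noise_only[:1000], rng.normal(1.0, 0.5, size=1000)])
        window = 100
        energy = np.array([np.mean(samples[i:i + window] ** 2)
                           for i in range(0, len(samples) - window, window)])
        threshold = noise_var * (1 + 4 / np.sqrt(window))
        print("noise variance: %.4f" % noise_var)
        print("windows flagged as signal: %d" % np.sum(energy > threshold))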

    Read the article

  • My Latest Books – Professional C# 2010 and Professional ASP.NET 4

    - by Bill Evjen
    My two latest books are out! Professional ASP.NET 4 in C# and VB Professional C# 4 and .NET 4 From the back covers: Take your web development to the next level using ASP.NET 4 ASP.NET is about making you as productive as possible when building fast and secure web applications. Each release of ASP.NET gets better and removes a lot of the tedious code that you previously needed to put in place, making common ASP.NET tasks easier. With this book, an unparalleled team of authors walks you through the full breadth of ASP.NET and the new and exciting capabilities of ASP.NET 4. The authors also show you how to maximize the abundance of features that ASP.NET offers to make your development process smoother and more efficient. Professional ASP.NET 4: Demonstrates ASP.NET built-in systems such as the membership and role management systems Covers everything you need to know about working with and manipulating data Discusses the plethora of server controls that are at your disposal Explores new ways to build ASP.NET, such as working with ASP.NET MVC and ASP.NET AJAX Examines the full life cycle of ASP.NET, including debugging and error handling, HTTP modules, the provider model, and more Features both printed and downloadable C# and VB code examples Start using the new features of C# 4 and .NET 4 right away The new C# 4 language version is indispensable for writing code in Visual Studio 2010. This essential guide emphasizes that C# is the language of choice for your .NET 4 applications. The unparalleled author team of experts begins with a refresher of C# basics and quickly moves on to provide detailed coverage of all the recently added language and Framework features so that you can start writing Windows applications and ASP.NET web applications immediately. Reviews the .NET architecture, objects, generics, inheritance, arrays, operators, casts, delegates, events, Lambda expressions, and more Details integration with dynamic objects in C#, named and optional parameters, COM-specific interop features, and type-safe variance Provides coverage of new features of .NET 4, Workflow Foundation 4, ADO.NET Data Services, MEF, the Parallel Task Library, and PLINQ Has deep coverage of great technologies including LINQ, WCF, WPF, flow and fixed documents, and Silverlight Reviews ASP.NET programming and goes into new features such as ASP.NET MVC and ASP.NET Dynamic Data Discusses communication with WCF, MSMQ, peer-to-peer, and syndication

    Read the article

  • I am not the most logically-organized person. Do I have any chance at being a good 'low-level' programmer?

    - by user217902
    Background: I am entering college next year. I really enjoy making stuff and solving logical problems, so I'm thinking of majoring in compsci and working in software development. I hope to have the kind of job where I can work with implementing / improving algorithms and data structures on a regular basis, as opposed to, say, a job that's purely concerned with mashing different libraries together, or 'finding the right APIs for the job'. (Hence the word 'low-level' in the title. No, I don't wish to write assembly all day.) Thing is, I've never been the most logically sharp person. Thus far I have only worked on hobby projects, but I find that I make the silliest of errors ever so often, and it can take me ages to find them - anywhere from three hours to a day to locate a simple segfault, off-by-one error, or other logical mistake. (Of course, I do other things in the meantime, like browsing SO, reddit, and the like.) It's not like I'm 'new' to programming either; I first tried C++ maybe five years ago. My question is: is this normal? Should a programmer with any talent solve it in less time? Having read Spolsky's Smart and Gets Things Done, where he talks about the large variance in programming speed, am I near the bottom of the curve, and therefore destined to work in companies that cannot afford to hire quality programmers? I'd like to think that conceptually I'm okay - I can grasp algorithms and concepts pretty well, and I do fine in math and science, although I probably drop signs in my equations more often than the next guy. Still, grokking concepts makes me happy, and is the reason why I want to work with algorithms. I'm hoping to hear from those of you with real-world programming experience. TL;DR: I make many careless mistakes; should I not consider programming as a career?

    Read the article

  • Visual Studio 2010 and .NET 4 Released

    - by ScottGu
    The final release of Visual Studio 2010 and .NET 4 is now available. Download and Install Today MSDN subscribers, as well as WebsiteSpark/BizSpark/DreamSpark members, can now download the final releases of Visual Studio 2010 and TFS 2010 through the MSDN subscribers download center.  If you are not an MSDN Subscriber, you can download free 90-day trial editions of Visual Studio 2010.  Or you can can download the free Visual Studio express editions of Visual Web Developer 2010, Visual Basic 2010, Visual C# 2010 and Visual C++.  These express editions are available completely for free (and never time out).  If you are looking for an easy way to setup a new machine for web-development you can automate installing ASP.NET 4, ASP.NET MVC 2, IIS, SQL Server Express and Visual Web Developer 2010 Express really quickly with the Microsoft Web Platform Installer (just click the install button on the page). What is new with VS 2010 and .NET 4 Today’s release is a big one – and brings with it a ton of new feature and capabilities. One of the things we tried hard to focus on with this release was to invest heavily in making existing applications, projects and developer experiences better.  What this means is that you don’t need to read 1000+ page books or spend time learning major new concepts in order to take advantage of the release.  There are literally thousands of improvements (both big and small) that make you more productive and successful without having to learn big new concepts in order to start using them.  Below is just a small sampling of some of the improvements with this release: Visual Studio 2010 IDE  Visual Studio 2010 now supports multiple-monitors (enabling much better use of screen real-estate).  It has new code Intellisense support that makes it easier to find and use classes and methods. It has improved code navigation support for searching code-bases and seeing how code is called and used.  It has new code visualization support that allows you to see the relationships across projects and classes within projects, as well as to automatically generate sequence diagrams to chart execution flow.  The editor now supports HTML and JavaScript snippet support as well as improved JavaScript intellisense. The VS 2010 Debugger and Profiling support is now much, much richer and enables new features like Intellitrace (aka Historical Debugging), debugging of Crash/Dump files, and better parallel debugging.  VS 2010’s multi-targeting support is now much richer, and enables you to use VS 2010 to target .NET 2, .NET 3, .NET 3.5 and .NET 4 applications.  And the infamous Add Reference dialog now loads much faster. TFS 2010 is now easy to setup (you can now install the server in under 10 minutes) and enables great source-control, bug/work-item tracking, and continuous integration support.  Testing (both automated and manual) is now much, much richer.  And VS 2010 Premium and Ultimate provide much richer architecture and design tooling support. VB and C# Language Features VB and C# in VS 2010 both contain a bunch of new features and capabilities.  VB adds new support for automatic properties, collection initializers, and implicit line continuation support among many other features.  C# adds support for optional parameters and named arguments, a new dynamic keyword, co-variance and contra-variance, and among many other features. ASP.NET 4 and ASP.NET MVC 2 With ASP.NET 4, Web Forms controls now render clean, semantically correct, and CSS friendly HTML markup. 
Built-in URL routing functionality allows you to expose clean, search engine friendly, URLs and increase the traffic to your Website.  ViewState within applications can now be more easily controlled and made smaller.  ASP.NET Dynamic Data support has been expanded.  More controls, including rich charting and data controls, are now built-into ASP.NET 4 and enable you to build applications even faster.  New starter project templates now make it easier to get going with new projects.  SEO enhancements make it easier to drive traffic to your public facing sites.  And web.config files are now clean and simple. ASP.NET MVC 2 is now built-into VS 2010 and ASP.NET 4, and provides a great way to build web sites and applications using a model-view-controller based pattern. ASP.NET MVC 2 adds features to easily enable client and server validation logic, provides new strongly-typed HTML and UI-scaffolding helper methods.  It also enables more modular/reusable applications.  The new <%: %> syntax in ASP.NET makes it easier to HTML encode output.  Visual Studio 2010 also now includes better tooling support for unit testing and TDD.  In particular, “Consume first intellisense” and “generate from usage" support within VS 2010 make it easier to write your unit tests first, and then drive your implementation from them. Deploying ASP.NET applications gets a lot easier with this release. You can now publish your Websites and applications to a staging or production server from within Visual Studio itself. Visual Studio 2010 makes it easy to transfer all your files, code, configuration, database schema and data in one complete package. VS 2010 also makes it easy to manage separate web.config configuration files settings depending upon whether you are in debug, release, staging or production modes. WPF 4 and Silverlight 4 WPF 4 includes a ton of new improvements and capabilities including more built-in controls, richer graphics features (cached composition, pixel shader 3 support, layoutrounding, and animation easing functions), a much improved text stack (with crisper text rendering, custom dictionary support, and selection and caret brush options).  WPF 4 also includes a bunch of support to enable you to take advantage of new Windows 7 features – including multi-touch and Windows 7 shell integration. Silverlight 4 will launch this week as well.  You can watch my Silverlight 4 launch keynote streamed live Tuesday (April 13th) at 8am Pacific Time.  Silverlight 4 includes a ton of new capabilities – including a bunch for making it possible to build great business applications and out of the browser applications.  I’ll be doing a separate blog post later this week (once it is live on the web) that talks more about its capabilities. Visual Studio 2010 now includes great tooling support for both WPF and Silverlight.  The new VS 2010 WPF and Silverlight designer makes it much easier to build client applications as well as build great line of business solutions, as well as integrate and bind with data.  Tooling support for Silverlight 4 with the final release of Visual Studio 2010 will be available when Silverlight 4 releases to the web this week. SharePoint and Azure Visual Studio 2010 now includes built-in support for building SharePoint applications.  You can now create, edit, build, and debug SharePoint applications directly within Visual Studio 2010.  You can also now use SharePoint with TFS 2010. 
Support for creating Azure-hosted applications is also now included with VS 2010 – allowing you to build ASP.NET and WCF based applications and host them within the cloud. Data Access Data access has a lot of improvements coming to it with .NET 4.  Entity Framework 4 includes a ton of new features and capabilities – including support for model first and POCO development, default support for lazy loading, built-in support for pluralization/singularization of table/property names within the VS 2010 designer, full support for all the LINQ operators, the ability to optionally expose foreign keys on model objects (useful for some stateless web scenarios), disconnected API support to better handle N-Tier and stateless web scenarios, and T4 template customization support within VS 2010 to allow you to customize and automate how code is generated for you by the data designer.  In addition to improvements with the Entity Framework, LINQ to SQL with .NET 4 also includes a bunch of nice improvements.  WCF and Workflow WCF includes a bunch of great new capabilities – including better REST, activation and configuration support.  WCF Data Services (formerly known as Astoria) and WCF RIA Services also now enable you to easily expose and work with data from remote clients. Windows Workflow is now much faster, includes flowchart services, and now makes it easier to make custom services than before.  More details can be found here. CLR and Core .NET Library Improvements .NET 4 includes the new CLR 4 engine – which includes a lot of nice performance and feature improvements.  CLR 4 engine now runs side-by-side in-process with older versions of the CLR – allowing you to use two different versions of .NET within the same process.  It also includes improved COM interop support.  The .NET 4 base class libraries (BCL) include a bunch of nice additions and refinements.  In particular, the .NET 4 BCL now includes new parallel programming support that makes it much easier to build applications that take advantage of multiple CPUs and cores on a computer.  This work dove-tails nicely with the new VS 2010 parallel debugger (making it much easier to debug parallel applications), as well as the new F# functional language support now included in the VS 2010 IDE.  .NET 4 also now also has the Dynamic Language Runtime (DLR) library built-in – which makes it easier to use dynamic language functionality with .NET.  MEF – a really cool library that enables rich extensibility – is also now built-into .NET 4 and included as part of the base class libraries.  .NET 4 Client Profile The download size of the .NET 4 redist is now much smaller than it was before (the x86 full .NET 4 package is about 36MB).  We also now have a .NET 4 Client Profile package which is a pure sub-set of the full .NET that can be used to streamline client application installs. C++ VS 2010 includes a bunch of great improvements for C++ development.  This includes better C++ Intellisense support, MSBuild support for projects, improved parallel debugging and profiler support, MFC improvements, and a number of language features and compiler optimizations. My VS 2010 and .NET 4 Blog Series I’ve been cranking away on a blog series the last few months that highlights many of the new VS 2010 and .NET 4 improvements.  The good news is that I have about 20 in-depth posts already written.  The bad news (for me) is that I have about 200 more to go until I’m done!  
I’m going to try and keep adding a few more each week over the next few months to discuss the new improvements and how best to take advantage of them. Below is a list of the already written ones that you can check out today: Clean Web.Config Files Starter Project Templates Multi-targeting Multiple Monitor Support New Code Focused Web Profile Option HTML / ASP.NET / JavaScript Code Snippets Auto-Start ASP.NET Applications URL Routing with ASP.NET 4 Web Forms Searching and Navigating Code in VS 2010 VS 2010 Code Intellisense Improvements WPF 4 Add Reference Dialog Improvements SEO Improvements with ASP.NET 4 Output Cache Extensibility with ASP.NET 4 Built-in Charting Controls for ASP.NET and Windows Forms Cleaner HTML Markup with ASP.NET 4 - Client IDs Optional Parameters and Named Arguments in C# 4 - and a cool scenarios with ASP.NET MVC 2 Automatic Properties, Collection Initializers and Implicit Line Continuation Support with VB 2010 New <%: %> Syntax for HTML Encoding Output using ASP.NET 4 JavaScript Intellisense Improvements with VS 2010 Stay tuned to my blog as I post more.  Also check out this page which links to a bunch of great articles and videos done by others. VS 2010 Installation Notes If you have installed a previous version of VS 2010 on your machine (either the beta or the RC) you must first uninstall it before installing the final VS 2010 release.  I also recommend uninstalling .NET 4 betas (including both the client and full .NET 4 installs) as well as the other installs that come with VS 2010 (e.g. ASP.NET MVC 2 preview builds, etc).  The uninstalls of the betas/RCs will clean up all the old state on your machine – after which you can install the final VS 2010 version and should have everything just work (this is what I’ve done on all of my machines and I haven’t had any problems). The VS 2010 and .NET 4 installs add a bunch of new managed assemblies to your machine.  Some of these will be “NGEN’d” to native code during the actual install process (making them run fast).  To avoid adding too much time to VS setup, though, we don’t NGEN all assemblies immediately – and instead will NGEN the rest in the background when your machine is idle.  Until it finishes NGENing the assemblies they will be JIT’d to native code the first time they are used in a process – which for large assemblies can sometimes cause a slight performance hit. If you run into this you can manually force all assemblies to be NGEN’d to native code immediately (and not just wait till the machine is idle) by launching the Visual Studio command line prompt from the Windows Start Menu (Microsoft Visual Studio 2010->Visual Studio Tools->Visual Studio Command Prompt).  Within the command prompt type “Ngen executequeueditems” – this will cause everything to be NGEN’d immediately. How to Buy Visual Studio 2010 You can can download and use the free Visual Studio express editions of Visual Web Developer 2010, Visual Basic 2010, Visual C# 2010 and Visual C++.  These express editions are available completely for free (and never time out). You can buy a new copy of VS 2010 Professional that includes a 1 year subscription to MSDN Essentials for $799.  MSDN Essentials includes a developer license of Windows 7 Ultimate, Windows Server 2008 R2 Enterprise, SQL Server 2008 DataCenter R2, and 20 hours of Azure hosting time.  Subscribers also have access to MSDN’s Online Concierge, and Priority Support in MSDN Forums. Upgrade prices from previous releases of Visual Studio are also available.  
Existing Visual Studio 2005/2008 Standard customers can upgrade to Visual Studio 2010 Professional for a special $299 retail price until October.  You can take advantage of this VS Standard->Professional upgrade promotion here. Web developers who build applications for others, and who are either independent developers or who work for companies with less than 10 employees, can also optionally take advantage of the Microsoft WebSiteSpark program.  This program gives you three copies of Visual Studio 2010 Professional, 1 copy of Expression Studio, and 4 CPU licenses of both Windows 2008 R2 Web Server and SQL 2008 Web Edition that you can use to both develop and deploy applications with at no cost for 3 years.  At the end of the 3 years there is no obligation to buy anything.  You can sign-up for WebSiteSpark today in under 5 minutes – and immediately have access to the products to download. Summary Today’s release is a big one – and has a bunch of improvements for pretty much every developer.  Thank you everyone who provided feedback, suggestions and reported bugs throughout the development process – we couldn’t have delivered it without you.  Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • C# 4.0 RC, Silverlight 4.0 RC Covariance

    - by Ant
    Hi, I am trying to develop a Silverlight 4 application using C# 4.0. I have a case like this:
        public class Foo<T> : IEnumerable<T> { .... }
    Elsewhere:
        public class MyBaseType : MyInterface { ... }
    And the usage where I am having problems:
        Foo<MyBaseType> aBunchOfStuff = new Foo<MyBaseType>();
        Foo<MyInterface> moreGeneralStuff = aBunchOfStuff;
    Now I believe this was impossible in C# 3.0 because generic types were "invariant". However, I thought this was possible in C# 4.0 through the new covariance for generics? As I understand it, in C# 4.0 a lot of common interfaces (like IEnumerable) have been modified to support variance. In this case, does my Foo class need to do anything special in order to become covariant? And is covariance supported in Silverlight 4 (RC)?

    Read the article

  • computing matting Laplacian matrix of an Image

    - by ajith
    Hi everyone, I need to compute the matting Laplacian matrix L for an nXn image in OpenCV. The computation goes as follows: the (i,j)-th element of L is the sum, over every window wk that contains both pixels i and j, of
        delta_ij - (1/|wk|) * [1 + (Ii - µk)(Ij - µk) / (e/|wk| + s2k)]
    where delta_ij is the Kronecker delta, µk and s2k are the mean and variance of the intensities in the window wk around pixel k, |wk| is the number of pixels in this window, and wk is a 3X3 window. Here I am not clear about 2 things: 1. What will be the size of L - nXn or (nXn)X(nXn)? 2. How do I select Ii and Ij separately in a 2D image?
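    For what it's worth, a rough grayscale numpy/scipy sketch of that formula (Levin et al.'s matting Laplacian), mainly to make the two questions concrete: for an image with n*n pixels, L is (n*n) x (n*n) and sparse, and Ii, Ij are the intensities of two pixels addressed by their flattened index.

        import numpy as np
        from scipy.sparse import coo_matrix

        def matting_laplacian(I, eps=1e-7, r=1):  # r=1 gives 3x3 windows
            h, w = I.shape
            n = h * w
            idx = np.arange(n).reshape(h, w)  # flattened index of every pixel
            win = (2 * r + 1) ** 2            # |wk| = 9
            rows, cols, vals = [], [], []
            for y in range(r, h - r):
                for x in range(r, w - r):     # window wk centered at pixel k = (y, x)
                    patch = I[y - r:y + r + 1, x - r:x + r + 1].ravel()
                    ids = idx[y - r:y + r + 1, x - r:x + r + 1].ravel()
                    mu, var = patch.mean(), patch.var()
                    d = patch - mu
                    block = np.eye(win) - (1.0 / win) * (1 + np.outer(d, d) / (eps / win + var))
                    rows.append(np.repeat(ids, win))  # every (i, j) pair inside wk
                    cols.append(np.tile(ids, win))
                    vals.append(block.ravel())
            L = coo_matrix((np.concatenate(vals),
                            (np.concatenate(rows), np.concatenate(cols))), shape=(n, n))
            return L.tocsr()  # duplicate (i, j) entries from overlapping windows are summed

        I = np.random.rand(20, 20)            # small grayscale test image
        print(matting_laplacian(I).shape)     # (400, 400), i.e. (n*n) x (n*n)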

    Read the article

  • Eta/Eta-squared routines in R

    - by aL3xa
    Apart from graphical estimation of linearity (the gaze-at-scatterplot method), which is utilized before applying some technique from the GLM family, there are several ways to do this estimation arithmetically (i.e. without graphs). Right now, I'll focus on Fisher's eta-squared - the correlation ratio: arithmetically, it's equal to the squared Pearson's r (the coefficient of determination, R2) if the relationship between the two variables is linear. Hence, you can compare the values of eta and r and make an assessment about the type of relation (linear or not). It provides information about the percentage of variance in the dependent variable explained (linearly or not) by the independent variable. Therefore, you can apply it when linearity assumptions are not met. Simply stated: is there a routine for eta/eta-squared in R?
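    The question is about an R routine, but as an illustration of what the correlation ratio actually computes (using the usual binned definition, which is an assumption here), a short Python sketch: bin the independent variable, take the ratio of between-bin variance to total variance of y, and compare it with the squared Pearson r.

        import numpy as np

        def eta_squared(x, y, bins=10):
            y = np.asarray(y, dtype=float)
            groups = np.digitize(x, np.histogram_bin_edges(x, bins=bins))
            grand_mean = y.mean()
            ss_between = sum(len(y[groups == g]) * (y[groups == g].mean() - grand_mean) ** 2
                             for g in np.unique(groups))
            ss_total = ((y - grand_mean) ** 2).sum()
            return ss_between / ss_total  # the correlation ratio, eta^2

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 10, 500)
        y = np.sin(x) + rng.normal(0, 0.2, 500)  # clearly non-linear relation
        r2 = np.corrcoef(x, y)[0, 1] ** 2
        # A large gap between eta^2 and r^2 hints that the relation is not linear.
        print("eta^2 = %.3f, r^2 = %.3f" % (eta_squared(x, y), r2))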

    Read the article

  • Choosing a plotting library for web/browser application

    - by Goro
    Hello, I am looking for a plotting/graphing library (mostly to do line plots) for my application. I have been looking at JavaScript APIs (like Google's) but I found them to slow things down on the client side (I am plotting quite a large number of points). I also found that with client-side libraries, the performance varied quite a bit depending on the user's computer. By moving to a server-side library I would cut down on this variance, and would have more control over data flow (my data is in a MySQL database). I have then looked at some PHP-based plotting libraries, but a lot of them seem to be "forgotten" (no new version for years). I have been eyeing pChart, but it has not had an update in almost two years. First, what would you recommend: a server-side or client-side approach? Second, what library would you recommend? Paid libraries are definitely an option, as I don't mind paying for quality software that would cut down on my development time. Thanks,

    Read the article

  • Findbugs warning: Equals method should not assume anything about the type of its argument

    - by Uri
    When running FindBugs on my project, I got a few instances of the warning described above. Namely, my overriding versions of equals cast the RHS object into the same type as the object in which the overriding version is defined. However, I'm not sure whether a better design is possible, since AFAIK Java does not allow variance in method parameters, so it is not possible to define any other type for the equals parameter. Am I doing something very wrong, or is FindBugs too eager? A different way to phrase this question is: what is the correct behavior if the object passed to equals is not the same type as the LHS - should it return false, or should an exception be thrown? For example:
        public boolean equals(Object rhs) {
            MyType rhsMyType = (MyType)rhs; // Should throw exception
            if (this.field1().equals(rhsMyType.field1())) ... // Or whatever
        }

    Read the article

  • ReportViewer Cell Formatting

    - by firedrawndagger
    I need to change the color of a cell in ReportViewer based on its value. 1) I got the first part down - I'm comparing the difference between two numbers like so: =IIf(Fields!variance.Value > 0) 2) Now how do I actually change the visual formatting of the cell? A web-standards way would be to add a class - since then you can format the class however you like, e.g. changing borders, typography, background color etc. Since ReportViewer sucks and doesn't have that functionality, what should I use instead? Any hacks I could use?

    Read the article

  • How to generate graphs and statistics from SQLAlchemy tables [Python]?

    - by Az
    Hi all, After running a bunch of simulations I'm going to be outputting the results into a table created using SQLAlchemy. I plan to use this data to generate statistics - mean and variance being key. These, in turn, will be used to generate some graphs - histograms/line graphs, pie charts and box-and-whisker plots specifically. I'm aware of the Python graphing libraries like matplotlib. The thing is, I'm not sure how to integrate this with the information contained within the database tables. Any suggestions on how to make these two play with each other? The main problem is that I'm not sure how to supply the information as "data sets" to the graphing library. Thanks in advance.
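    A minimal sketch of the hand-off (the table and column names and the SQLite file are made-up placeholders, and it assumes the modern SQLAlchemy Core API): query the results, turn the column of interest into a plain array, compute the statistics, and pass that same array to matplotlib as the data set.

        import numpy as np
        import matplotlib.pyplot as plt
        from sqlalchemy import create_engine, select, MetaData, Table

        engine = create_engine("sqlite:///simulations.db")           # assumed database file
        metadata = MetaData()
        results = Table("results", metadata, autoload_with=engine)   # reflect the existing table

        with engine.connect() as conn:
            scores = [row.score for row in conn.execute(select(results.c.score))]

        data = np.asarray(scores, dtype=float)
        print("mean = %.3f, variance = %.3f" % (data.mean(), data.var()))

        plt.hist(data, bins=30)       # the list/array itself is the "data set"
        plt.xlabel("score")
        plt.ylabel("frequency")
        plt.savefig("score_histogram.png")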

    Read the article

  • <function> referenced from; symbol(s) not found.

    - by jfm429
    I have a piece of C code that is used from a C++ function. At the top of my C++ file I have the line:
        #include "prediction.h"
    In prediction.h I have this:
        #ifndef prediction
        #define prediction

        #include "structs.h"

        typedef struct {
            double estimation;
            double variance;
        } response;

        response runPrediction(int obs, location* positions, double* observations,
                               int targets, location* targetPositions);

        #endif
    I also have prediction.c, which has:
        #include "prediction.h"

        response runPrediction(int obs, location* positions, double* observations,
                               int targets, location* targetPositions) {
            // code here
        }
    Now, in my C++ file (which, as I said, includes prediction.h) I call that function, and when I compile (through Xcode) I get this error:
        "runPrediction(int, location*, double*, int, location*)", referenced from:
        mainFrame::respondTo(char*, int)in mainFrame.o
        ld: symbol(s) not found
        collect2: ld returned 1 exit status
    prediction.c is marked for compilation for the current target. I don't have any problems with other .cpp files not being compiled. Any thoughts here?

    Read the article

  • What are the other new features of C# 4.0, after dynamic and optional parameters?

    - by Abel
    So, C# 4.0 came out yesterday. It introduced the much-debated dynamic keyword, and named and optional parameters. Smaller improvements were the implicit ref and the recognition of indexed and default properties on COM methods, and contra- and co-variance (really a .NET CLR feature, not C# only) and... Is that really it? Are dynamic and optional/named params the only real improvements to C#? Or did I miss something? Not that I'm complaining, but it seems a bit meager after C# 2.0 (generics) and C# 3.0 (lambdas, LINQ). Maybe the language has just reached actual maturity?

    Read the article

  • ArchBeat Link-o-Rama for 11/17/2011

    - by Bob Rhubart
    Building an Infrastructure Cloud with Oracle VM for x86 + Enterprise Manager 12c | Richard Rotter Richard Rotter demonstrates "how easy it could be to build a cloud infrastructure with Oracle's solution for cloud computing." Article: Social + Lean = Agile | Dave Duggal In today’s increasingly dynamic business environment, organizations must continuously adapt to survive. Change management has become a major bottleneck. Organizations’ need a practical mechanism for managing controlled variance and change in-flight to break the logjam. This paper provides a foundation for applying lean and agile principles to achieve Enterprise Agility through social collaboration. Stress Testing Java EE 6 Applications - Free Article In Free Java Magazine : Adam Bien "It is strange," says Adam Bien, "everyone is obsessed about green bars and code coverage, but testing of multi threaded behavior is widely ignored - until the applications run into massive problems." Using Access Manager to Secure Applications Deployed on WebLogic | Rene van Wijk Another great how-to post from Oracle ACE Rene van Wijk, this time involving JBoss RichFaces, Facelets, Oracle Coherence, and Oracle WebLogic Server. DOAG 2011 vs. Devoxx - Value and Attraction | Markus Eisele Oracle ACE Director Markus Eisele compares and contrasts these popular conferences with the aim of helping others decide which to attend. SOA All the Time; Architects in AZ; Clearing Info Integration hurdles SOA all the Time; Architects in AZ; Clearing Info Integration Hurdles This week on the Architect Home Page on OTN. Webcast: Oracle Business Intelligence Mobile Event Date: Wednesday, December 7, 2011 Time: 10 a.m. PT/1 p.m. ET Featuring Manan Goel (Director BI Product Marketing, Oracle) and Shailesh Shedge (Director BI and Analytics Practice, Ascentt). Webcast: Maximum Availability on Private Clouds A discussion of Oracle’s Maximum Availability Architecture, Oracle Database 11g, Oracle Exadata Database Machine, and Oracle Database appliance, featuring Margaret Hamburger (Director, Product Marketing, Oracle) and Joe Meeks (Director, Product Management, Oracle). November 30, 2011 at 10:00am PT / 1:00pm ET. Oracle Technology Network Architect Day - Phoenix, AZ Wednesday December 14, 2011, 8:30am - 5:00pm. The Ritz-Carlton, Phoenix, 2401 East Camelback Road, Phoenix, AZ 85016. Registration is free, but seating is limited.

    Read the article

  • What Precalculus knowledge is required before learning Discrete Math Computer Science topics?

    - by Ein Doofus
    Below I've listed the chapters from a Precalculus book as well as the author recommended Computer Science chapters from a Discrete Mathematics book. Although these chapters are from two specific books on these subjects I believe the topics are generally the same between any Precalc or Discrete Math book. What Precalculus topics should one know before starting these Discrete Math Computer Science topics?: Discrete Mathematics CS Chapters 1.1 Propositional Logic 1.2 Propositional Equivalences 1.3 Predicates and Quantifiers 1.4 Nested Quantifiers 1.5 Rules of Inference 1.6 Introduction to Proofs 1.7 Proof Methods and Strategy 2.1 Sets 2.2 Set Operations 2.3 Functions 2.4 Sequences and Summations 3.1 Algorithms 3.2 The Growths of Functions 3.3 Complexity of Algorithms 3.4 The Integers and Division 3.5 Primes and Greatest Common Divisors 3.6 Integers and Algorithms 3.8 Matrices 4.1 Mathematical Induction 4.2 Strong Induction and Well-Ordering 4.3 Recursive Definitions and Structural Induction 4.4 Recursive Algorithms 4.5 Program Correctness 5.1 The Basics of Counting 5.2 The Pigeonhole Principle 5.3 Permutations and Combinations 5.6 Generating Permutations and Combinations 6.1 An Introduction to Discrete Probability 6.4 Expected Value and Variance 7.1 Recurrence Relations 7.3 Divide-and-Conquer Algorithms and Recurrence Relations 7.5 Inclusion-Exclusion 8.1 Relations and Their Properties 8.2 n-ary Relations and Their Applications 8.3 Representing Relations 8.5 Equivalence Relations 9.1 Graphs and Graph Models 9.2 Graph Terminology and Special Types of Graphs 9.3 Representing Graphs and Graph Isomorphism 9.4 Connectivity 9.5 Euler and Hamilton Ptahs 10.1 Introduction to Trees 10.2 Application of Trees 10.3 Tree Traversal 11.1 Boolean Functions 11.2 Representing Boolean Functions 11.3 Logic Gates 11.4 Minimization of Circuits 12.1 Language and Grammars 12.2 Finite-State Machines with Output 12.3 Finite-State Machines with No Output 12.4 Language Recognition 12.5 Turing Machines Precalculus Chapters R.1 The Real-Number System R.2 Integer Exponents, Scientific Notation, and Order of Operations R.3 Addition, Subtraction, and Multiplication of Polynomials R.4 Factoring R.5 Rational Expressions R.6 Radical Notation and Rational Exponents R.7 The Basics of Equation Solving 1.1 Functions, Graphs, Graphers 1.2 Linear Functions, Slope, and Applications 1.3 Modeling: Data Analysis, Curve Fitting, and Linear Regression 1.4 More on Functions 1.5 Symmetry and Transformations 1.6 Variation and Applications 1.7 Distance, Midpoints, and Circles 2.1 Zeros of Linear Functions and Models 2.2 The Complex Numbers 2.3 Zeros of Quadratic Functions and Models 2.4 Analyzing Graphs of Quadratic Functions 2.5 Modeling: Data Analysis, Curve Fitting, and Quadratic Regression 2.6 Zeros and More Equation Solving 2.7 Solving Inequalities 3.1 Polynomial Functions and Modeling 3.2 Polynomial Division; The Remainder and Factor Theorems 3.3 Theorems about Zeros of Polynomial Functions 3.4 Rational Functions 3.5 Polynomial and Rational Inequalities 4.1 Composite and Inverse Functions 4.2 Exponential Functions and Graphs 4.3 Logarithmic Functions and Graphs 4.4 Properties of Logarithmic Functions 4.5 Solving Exponential and Logarithmic Equations 4.6 Applications and Models: Growth and Decay 5.1 Systems of Equations in Two Variables 5.2 System of Equations in Three Variables 5.3 Matrices and Systems of Equations 5.4 Matrix Operations 5.5 Inverses of Matrices 5.6 System of Inequalities and Linear Programming 5.7 Partial Fractions 
6.1 The Parabola 6.2 The Circle and Ellipse 6.3 The Hyperbola 6.4 Nonlinear Systems of Equations

    Read the article

  • Will disabling hyperthreading improve performance on our SQL Server install

    - by Sam Saffron
    Related to: Current wisdom on SQL Server and Hyperthreading. Recently we upgraded our Windows 2008 R2 database server from an X5470 to an X5560. The theory is that both CPUs have very similar performance; if anything, the X5560 is slightly faster. However, SQL Server 2008 R2 performance has been pretty bad over the last day or so and CPU usage has been pretty high. Page life expectancy is massive and we are getting almost 100% cache hit for the pages, so memory is not a problem. When I ran:
        SELECT * FROM sys.dm_os_wait_stats order by signal_wait_time_ms desc
    I got:
        wait_type                     waiting_tasks_count  wait_time_ms  max_wait_time_ms  signal_wait_time_ms
        ----------------------------- -------------------- ------------- ----------------- --------------------
        XE_TIMER_EVENT                115166               2799125790    30165             2799125065
        REQUEST_FOR_DEADLOCK_SEARCH   559393               2799053973    5180              2799053973
        SOS_SCHEDULER_YIELD           152289883            189948844     960               189756877
        CXPACKET                      234638389            2383701040    141334            118796827
        SLEEP_TASK                    170743505            1525669557    1406              76485386
        LATCH_EX                      97301008             810738519     1107              55093884
        LOGMGR_QUEUE                  16525384             2798527632    20751319          4083713
        WRITELOG                      16850119             18328365      1193              2367880
        PAGELATCH_EX                  13254618             8524515       11263             1670113
        ASYNC_NETWORK_IO              23954146             6981220       7110              1475699
        (10 row(s) affected)
    I also ran:
        -- Isolate top waits for server instance since last restart or statistics clear
        WITH Waits AS
        (
            SELECT wait_type,
                   wait_time_ms / 1000. AS [wait_time_s],
                   100. * wait_time_ms / SUM(wait_time_ms) OVER() AS [pct],
                   ROW_NUMBER() OVER(ORDER BY wait_time_ms DESC) AS [rn]
            FROM sys.dm_os_wait_stats
            WHERE wait_type NOT IN ('CLR_SEMAPHORE','LAZYWRITER_SLEEP','RESOURCE_QUEUE',
                'SLEEP_TASK','SLEEP_SYSTEMTASK','SQLTRACE_BUFFER_FLUSH','WAITFOR','LOGMGR_QUEUE',
                'CHECKPOINT_QUEUE','REQUEST_FOR_DEADLOCK_SEARCH','XE_TIMER_EVENT','BROKER_TO_FLUSH',
                'BROKER_TASK_STOP','CLR_MANUAL_EVENT','CLR_AUTO_EVENT','DISPATCHER_QUEUE_SEMAPHORE',
                'FT_IFTS_SCHEDULER_IDLE_WAIT','XE_DISPATCHER_WAIT', 'XE_DISPATCHER_JOIN')
        )
        SELECT W1.wait_type,
               CAST(W1.wait_time_s AS DECIMAL(12, 2)) AS wait_time_s,
               CAST(W1.pct AS DECIMAL(12, 2)) AS pct,
               CAST(SUM(W2.pct) AS DECIMAL(12, 2)) AS running_pct
        FROM Waits AS W1
        INNER JOIN Waits AS W2 ON W2.rn <= W1.rn
        GROUP BY W1.rn, W1.wait_type, W1.wait_time_s, W1.pct
        HAVING SUM(W2.pct) - W1.pct < 95; -- percentage threshold
    And got:
        wait_type            wait_time_s  pct    running_pct
        CXPACKET             554821.66    65.82  65.82
        LATCH_EX             184123.16    21.84  87.66
        SOS_SCHEDULER_YIELD  37541.17     4.45   92.11
        PAGEIOLATCH_SH       19018.53     2.26   94.37
        FT_IFTSHC_MUTEX      14306.05     1.70   96.07
    That shows huge amounts of time spent synchronizing queries involving parallelism (high CXPACKET). Additionally, anecdotally, many of these problem queries are being executed on multiple cores (we have no MAXDOP hints anywhere in our code). The server has not been under load for more than a day or so. We are experiencing a large amount of variance in query executions; typically many queries appear to be slower than they were on our previous DB server, and CPU is really high. Will disabling Hyperthreading help reduce our CPU usage and increase throughput?

    Read the article

  • Web page layout becomes broken when moved to live.

    - by Moses
    I want to preface my question with the fact that I'm only a front-end web developer, so please excuse my gross lack of knowledge in this area. My company has three webservers: one for development (IIS v5), one for staging (IIS v5), and one for live deployment (IIS v6). Staging is an exact mirror of live. When I compare the staging and live web pages side by side in Firefox (3.6), the pages are identical. However, when I compare the staging and live pages with Internet Explorer (8), there are major differences... In staging, the squares for bulleted lists are small. In live, the squares are big. In staging, the borders for tables are thick. In live, the borders are thin. In staging, an ASP generated image is the proper height. In live, the image is cropped at the bottom by about 10px. In the end, the layout on live became broken because of these tiny differences, but why? Does the fact that live is on IIS 6 and staging is on IIS 5 account for the small variance in display? And is there any way I can change this server side? Also, is there any reason why Firefox displays both correctly and IE displays both incorrectly?

    Read the article

  • Steps to debugging web site latency and timeout issues

    - by Paperjam
    I have a client who has multiple offices around the country, all of which share the same Internet connection via their WAN. One specific office for this client is experiencing severe latency and timeout issues with my web site. Most, but not all, of the latency occurs on a specific ASPX page where multiple postbacks are made while populating cascading dropdown lists (rapid form submits). The latency is sporadic and can be anywhere from a few seconds to a full timeout. There is no indication that the timeouts are occurring on the server's end. The IT guy for this client is having trouble narrowing down the problem. Since it is affecting only one location for one client, I am led to believe it is not something with my site but something specific to that location. He's measured ping times while using the site and has noticed no real variance in ping times even when the page has timed out. I believe this may be being caused by some sort of Internet filter that doesn't like rapid form submits, but beyond a hunch I haven't a clue. My question is what things should I tell the IT guy to look for? While I'm not trying to provide active tech support for this issue, I would at least like to glean an understanding of what is going on and try to offer some sort of advice. Thank you.

    Read the article

  • Do We Indeed Have a Future? George Takei on Star Wars.

    - by Bil Simser
    George Takei (rhymes with Okay), probably best known for playing Hikaru Sulu on the original Star Trek, has always had deep concerns for the present and the future. Whether on Earth or among the stars, he has the welfare of humanity very much at heart. I was digging through my old copies of Famous Monsters of Filmland, a great publication on monster and films that I grew up with, and came across this. This was his reaction to STAR WARS from issue 139 of Famous Monsters of Filmland and was written June 6, 1977. It is reprinted here without permission but I hope since the message is still valid to this day and has never been reprinted anywhere, nobody will mind me sharing it. STAR WARS is the most pre-posterously diverting galactic escape and at the same time the most hideously credible portent of the future yet.While I thrilled to the exploits that reminded me of the heroics of Errol Flynn as Robin Hood, Burt Lancaster as the Crimson Pirate and Buster Crabbe as Flash Gordon, I was at the same time aghast at the phantasmagoric violence technology can place at our disposal. STAR WARS raised in my mind the question - do we indeed have a future?It seems to me what George Lucas has done is to masterfully guide us on a journey through space and time and bring us back face to face with today's reality. STAR WARS is more than science fiction, I think it is science fictitious reality.Just yesterday, June 7, 1977, I read that the United States will embark on the production of a neutron bomb - a bomb that will kill people on a gigantic scale but will not destroy buildings. A few days before that, I read that the Pentagon is fearful that the Soviets may have developed a warhead that could neutralize ours that have a capacity for that irrational concept overkill to the nth power. Already, it seems we have the technology to realize the awesome special effects simulations that we saw in the film.The political scene of STAR WARS is that of government by force and power, of revolutions based on some unfathomable grievance, survival through a combination of cunning and luck and success by the harnessing of technology -  a picture not very much at variance from the political headlines that we read today.And most of all, look at the people; both the heroes in the film and the reaction of the audience. First, the heroes; Luke Skywalker is a pretty but easily led youth. Without any real philosophy to guide him, he easily falls under the influence of a mystical old man believed previously to be an eccentric hermit. Recognize a 1960's hippie or a 1970's moonie? Han Solo has a philosophy coupled with courage and skill. His philosophy is money. His proficiency comes for a price - the highest. Solo is a thoroughly avaricious mercenary. And the Princess, a decisive, strong, self-confident and chilly woman. The audience cheered when she wielded a gun. In all three, I missed qualities that could be called humane - love, kindness, yes, I missed sensuality. I also missed a sense of ideals and faith. In this regard the machines seemed more human. They demonstrated real affection for each other and an occasional poutiness. They exhibited a sense of fidelity and constancy. The machines were humanized and the humans conversely seemed mechanical.As a member of the audience, I was swept up by the sheer romantic escapsim of it all. The deering-dos, the rope swing escape across the pit, the ray gun battles and especially the swash buckle with the ray swords. 
Great fun!But I just hope that we weren't too intoxicated by the escapism to be able to focus on the recognizable. I hope the beauty of the effects didn't narcotize our sensitivity to violence. I hope the people see through the fantastically well done futuristic mirrors to the disquieting reflection of our own society. I hope they enjoy STAR WARS without being "purely entertained".

    Read the article
