Search Results

Search found 2559 results on 103 pages for 'analysis'.

Page 75/103 | < Previous Page | 71 72 73 74 75 76 77 78 79 80 81 82  | Next Page >

  • A Taxonomy of Numerical Methods v1

    - by JoshReuben
    Numerical Analysis – When, What (but not how). Once you understand the math & know C++, numerical methods are basically blocks of iterative & conditional math code. I found the real trick was seeing the forest for the trees – knowing which method to use for which situation. It's pretty easy to get lost in the details, so I've tried to organize these methods in a way that lets me look them up quickly. I've included links to detailed explanations and to C++ code examples. I've tried to classify numerical methods into the following broad categories: Solving Systems of Linear Equations; Solving Non-Linear Equations Iteratively; Interpolation; Curve Fitting; Optimization; Numerical Differentiation & Integration; Solving ODEs; Boundary Problems; Solving Eigenvalue Problems. Enjoy – I did!

    Solving Systems of Linear Equations
    Overview: solve sets of algebraic equations with x unknowns. The set is commonly in matrix form.

    Gauss-Jordan Elimination
    http://en.wikipedia.org/wiki/Gauss%E2%80%93Jordan_elimination
    C++: http://www.codekeep.net/snippets/623f1923-e03c-4636-8c92-c9dc7aa0d3c0.aspx
    Produces the solution of the equations & the coefficient matrix. Efficient and stable, in 2 steps:
    · Forward elimination – matrix decomposition: reduce the set to triangular form (0s below the diagonal), i.e. row echelon form. If the set is degenerate, then there is no solution.
    · Backward elimination – write the original matrix as the product of its inverse matrix & its reduced row-echelon matrix, reduce the set to row canonical form, & use back-substitution to find the solution to the set.
    Elementary ops for matrix decomposition:
    · Row multiplication
    · Row switching
    · Adding multiples of rows to other rows
    Use pivoting to ensure the rows are ordered for achieving triangular form.

    LU Decomposition
    http://en.wikipedia.org/wiki/LU_decomposition
    C++: http://ganeshtiwaridotcomdotnp.blogspot.co.il/2009/12/c-c-code-lu-decomposition-for-solving.html
    Represent the matrix as a product of lower & upper triangular matrices. A modified version of Gauss-Jordan elimination. Advantage – can easily apply forward & backward elimination to solve triangular matrices. Techniques:
    · Doolittle Method – sets the L matrix diagonal to unity
    · Crout Method – sets the U matrix diagonal to unity
    Note: since only one of the L & U matrices carries the unity diagonal, the two can be stored compactly in the same matrix.

    Gauss-Seidel Iteration
    http://en.wikipedia.org/wiki/Gauss%E2%80%93Seidel_method
    C++: http://www.nr.com/forum/showthread.php?t=722
    Rearrange the linear set of equations so that each equation isolates a single unknown, then solve iteratively (as the update formulas contain sums, it is implemented iteratively), sweeping through the equations and reusing each freshly updated value. An optimization of Gauss-Jacobi: roughly 1.5 times faster, requiring about a quarter of the iterations to achieve the same tolerance.
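    A minimal sketch of the Gauss-Seidel sweep (my own illustration, not the linked implementation; the example matrix is assumed to be diagonally dominant so that the iteration converges):

        #include <cmath>
        #include <cstdio>

        // Gauss-Seidel sketch: solve Ax = b for a small system, reusing each
        // freshly updated x[j] within the same sweep.
        int main() {
            const int n = 3;
            double A[n][n] = {{4, -1, 0}, {-1, 4, -1}, {0, -1, 4}};
            double b[n] = {15, 10, 10};
            double x[n] = {0, 0, 0};                     // initial guess

            for (int iter = 0; iter < 100; ++iter) {
                double maxDelta = 0.0;
                for (int i = 0; i < n; ++i) {
                    double sum = b[i];
                    for (int j = 0; j < n; ++j)
                        if (j != i) sum -= A[i][j] * x[j];   // latest values used where available
                    double xNew = sum / A[i][i];
                    maxDelta = std::fmax(maxDelta, std::fabs(xNew - x[i]));
                    x[i] = xNew;
                }
                if (maxDelta < 1e-10) break;             // converged to tolerance
            }
            std::printf("x = (%f, %f, %f)\n", x[0], x[1], x[2]);
            return 0;
        }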
    Solving Non-Linear Equations Iteratively
    Overview: find the roots of polynomials – there may be 0, 1 or n solutions for an n-order polynomial. Use iterative techniques:
    · Used when there are no known analytical techniques
    · Require the functions to be continuous & differentiable
    · Require an initial seed value – the choice is critical to convergence, so conduct multiple runs with different starting points & then select the best result
    · Systematic – iterate until diminishing-returns, tolerance or max-iteration conditions are met
    · Bracketing techniques will always yield convergent solutions; non-bracketing methods may fail to converge

    Incremental Method
    If a nonlinear function has opposite signs at the 2 ends of a small interval [x1, x2], then there is likely to be a solution in that interval. Solutions are detected by evaluating the function over interval steps, looking for a change in sign and adjusting the step size dynamically. Limitations – can miss closely spaced solutions in large intervals, cannot detect degenerate (coinciding) solutions, is limited to functions that cross the x-axis, and gives false positives at singularities.

    Fixed Point Method
    http://en.wikipedia.org/wiki/Fixed-point_iteration
    C++: http://books.google.co.il/books?id=weYj75E_t6MC&pg=PA79&lpg=PA79&dq=fixed+point+method++c%2B%2B&source=bl&ots=LQ-5P_taoC&sig=lENUUIYBK53tZtTwNfHLy5PEWDk&hl=en&sa=X&ei=wezDUPW1J5DptQaMsIHQCw&redir_esc=y#v=onepage&q=fixed%20point%20method%20%20c%2B%2B&f=false
    Algebraically rearrange the equation to isolate a variable, then apply the incremental method.

    Bisection Method
    http://en.wikipedia.org/wiki/Bisection_method
    C++: http://numericalcomputing.wordpress.com/category/algorithms/
    Bracketed – select an initial interval, keep bisecting it at its midpoint into sub-intervals, and then apply the incremental method on smaller & smaller intervals – zoom in. Advantage: unaffected by the function's gradient, so reliable. Disadvantage: slow convergence.

    False Position Method
    http://en.wikipedia.org/wiki/False_position_method
    C++: http://www.dreamincode.net/forums/topic/126100-bisection-and-false-position-methods/
    Bracketed – select an initial interval, & use the relative value of the function at the interval end points to select the next sub-interval (estimate how far between the end points the solution might be, & subdivide based on this).

    Newton-Raphson Method
    http://en.wikipedia.org/wiki/Newton's_method
    C++: http://www-users.cselabs.umn.edu/classes/Summer-2012/csci1113/index.php?page=./newt3
    Also known as Newton's method. Convenient, efficient. Not bracketed – only a single initial guess is required to start the iteration, but an analytical expression for the first derivative of the function is required as input. Evaluates the function & its derivative at each step. Can be extended to the Newton Multi-Root method for solving multiple roots. Can easily be applied to an n-coupled set of non-linear equations – conduct a Taylor series expansion of a function, drop terms of order n, rewrite as a Jacobian matrix of partial derivatives & convert to simultaneous linear equations!

    Secant Method
    http://en.wikipedia.org/wiki/Secant_method
    C++: http://forum.vcoderz.com/showthread.php?p=205230
    Unlike Newton-Raphson, estimates the first derivative from an initial interval instead of requiring it as input (it does not require the root to be bracketed). Since the derivative is approximated, it may need more iterations to converge, but it is fast in practice as it does not have to evaluate the derivative at each step. Similar implementation to the False Position method.
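    A minimal Newton-Raphson sketch (my own illustration, assuming f(x) = x^2 - 2 as a test function, with its derivative supplied analytically as the method requires):

        #include <cmath>
        #include <cstdio>

        // Newton-Raphson sketch: find a root of f(x) = x^2 - 2 (i.e. sqrt(2)).
        int main() {
            auto f  = [](double x) { return x * x - 2.0; };
            auto df = [](double x) { return 2.0 * x; };

            double x = 1.0;                       // single initial guess, no bracketing
            for (int iter = 0; iter < 50; ++iter) {
                double step = f(x) / df(x);
                x -= step;                        // x_{n+1} = x_n - f(x_n)/f'(x_n)
                if (std::fabs(step) < 1e-12) break;
            }
            std::printf("root ~ %.12f\n", x);
            return 0;
        }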
    Birge-Vieta Method
    http://mat.iitm.ac.in/home/sryedida/public_html/caimna/transcendental/polynomial%20methods/bv%20method.html
    C++: http://books.google.co.il/books?id=cL1boM2uyQwC&pg=SA3-PA51&lpg=SA3-PA51&dq=Birge-Vieta+Method+c%2B%2B&source=bl&ots=QZmnDTK3rC&sig=BPNcHHbpR_DKVoZXrLi4nVXD-gg&hl=en&sa=X&ei=R-_DUK2iNIjzsgbE5ID4Dg&redir_esc=y#v=onepage&q=Birge-Vieta%20Method%20c%2B%2B&f=false
    Combines Horner's method of polynomial evaluation (transforming into lesser-degree polynomials that are more computationally efficient to process) with Newton-Raphson to provide a computational speed-up.

    Interpolation
    Overview: construct new data points for as close as possible a fit within the range of a discrete set of known points (that were obtained via sampling or experimentation). Use a Taylor series expansion of a function f(x) around a specific value of x.

    Linear Interpolation
    http://en.wikipedia.org/wiki/Linear_interpolation
    C++: http://www.hamaluik.com/?p=289
    Straight line between 2 points – concatenate interpolants between each pair of data points.

    Bilinear Interpolation
    http://en.wikipedia.org/wiki/Bilinear_interpolation
    C++: http://supercomputingblog.com/graphics/coding-bilinear-interpolation/2/
    Extension of the linear function for interpolating functions of 2 variables – perform linear interpolation first in one direction, then in the other. Used in image processing – e.g. texture mapping filters. Uses 4 vertices to interpolate a value within a unit cell.

    Lagrange Interpolation
    http://en.wikipedia.org/wiki/Lagrange_polynomial
    C++: http://www.codecogs.com/code/maths/approximation/interpolation/lagrange.php
    For polynomials. Requires recomputation of all terms for each distinct x value – can only be applied for a small number of nodes. Numerically unstable.

    Barycentric Interpolation
    http://epubs.siam.org/doi/pdf/10.1137/S0036144502417715
    C++: http://www.gamedev.net/topic/621445-barycentric-coordinates-c-code-check/
    Rearrange the terms in the equation of the Lagrange interpolation by defining weight functions that are independent of the interpolated value of x.
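    A minimal Lagrange interpolation sketch (my own illustration, not the linked codecogs routine; lagrange() is a hypothetical helper, and the sample points are drawn from y = x^2 + x + 1). Note the full recomputation of every basis term per query – the weakness the barycentric rearrangement addresses:

        #include <cstdio>

        // Evaluate the interpolating polynomial through n (x, y) points at xq.
        double lagrange(const double* xs, const double* ys, int n, double xq) {
            double result = 0.0;
            for (int i = 0; i < n; ++i) {
                double basis = 1.0;     // L_i(xq) = prod_{j != i} (xq - x_j) / (x_i - x_j)
                for (int j = 0; j < n; ++j)
                    if (j != i) basis *= (xq - xs[j]) / (xs[i] - xs[j]);
                result += ys[i] * basis;
            }
            return result;
        }

        int main() {
            double xs[] = {0.0, 1.0, 2.0};
            double ys[] = {1.0, 3.0, 7.0};    // samples of y = x^2 + x + 1
            std::printf("p(1.5) = %f\n", lagrange(xs, ys, 3, 1.5));  // expect 4.75
            return 0;
        }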
    Newton Divided Difference Interpolation
    http://en.wikipedia.org/wiki/Newton_polynomial
    C++: http://jee-appy.blogspot.co.il/2011/12/newton-divided-difference-interpolation.html
    An interpolation polynomial approximation for a given set of data points in Newton form – divided differences are used to approximately calculate the various differences (Hermite divided differences are the related variant). For a given set of 3 data points, fit a quadratic interpolant through the data. Bracketed functions allow Newton divided differences to be calculated recursively into a difference table.

    Cubic Spline Interpolation
    http://en.wikipedia.org/wiki/Spline_interpolation
    C++: https://www.marcusbannerman.co.uk/index.php/home/latestarticles/42-articles/96-cubic-spline-class.html
    A spline is a piecewise polynomial. Provides smoothness – for interpolations with significantly varying data. Uses weighted coefficients to bend the function so that it is smooth & its 1st & 2nd derivatives are continuous through the edge points of each interval.

    Curve Fitting
    A generalization of interpolation whereby the given data points may contain noise, so the curve does not necessarily pass through all the points.

    Least Squares Fit
    http://en.wikipedia.org/wiki/Least_squares
    C++: http://www.ccas.ru/mmes/educat/lab04k/02/least-squares.c
    Residual – the difference between an observed value & the expected value. The model function is often chosen as a linear combination of specified functions. Determines: A) the model instance for which the sum of squared residuals has the least value; B) the parameter values for which the model best fits the data.

    Straight Line Fit
    A linear correlation between an independent variable and a dependent variable.

    Linear Regression
    http://en.wikipedia.org/wiki/Linear_regression
    C++: http://www.oocities.org/david_swaim/cpp/linregc.htm
    A special case of statistically exact extrapolation. Leverages least squares: given a basis function, the sum of the residuals is determined and the corresponding gradient equation is expressed as a set of normal linear equations in matrix form that can be solved (e.g. using LU Decomposition). Can be weighted – drop the assumption that all errors have the same significance: the confidence of accuracy is different for each data point, so fit the function closer to points with higher weights.

    Polynomial Fit
    Use a polynomial basis function.

    Moving Average
    http://en.wikipedia.org/wiki/Moving_average
    C++: http://www.codeproject.com/Articles/17860/A-Simple-Moving-Average-Algorithm
    Used for smoothing (cancelling fluctuations to highlight longer-term trends & cycles), time-series data analysis, and signal-processing filters. Replace each data point with the average of its neighbors. Can be simple (SMA), weighted (WMA), or exponential (EMA). Lags behind the latest data points – extra weight can be given to more recent data points, and weights can decrease arithmetically or exponentially according to distance from the point. Parameters: smoothing factor, period, weight basis.
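    A minimal SMA sketch (my own illustration, not the linked CodeProject article; sma() is a hypothetical helper, assuming a trailing window of the given period):

        #include <cstdio>
        #include <vector>

        // Simple moving average: smooth a series by replacing each point with
        // the average of a trailing window of `period` points.
        std::vector<double> sma(const std::vector<double>& data, int period) {
            std::vector<double> out;
            double windowSum = 0.0;
            for (int i = 0; i < static_cast<int>(data.size()); ++i) {
                windowSum += data[i];
                if (i >= period) windowSum -= data[i - period];   // slide the window
                if (i + 1 >= period) out.push_back(windowSum / period);
            }
            return out;
        }

        int main() {
            std::vector<double> noisy = {1, 2, 9, 4, 5, 8, 6, 7};
            for (double v : sma(noisy, 3)) std::printf("%.2f ", v);  // lags the raw series
            std::printf("\n");
            return 0;
        }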
    Optimization
    Overview: given a function with multiple variables, find a minimum (or a maximum, by minimizing –f(x)). An iterative approach – efficient, but not necessarily reliable. Complications: noisy data, constraints, non-linear models. Detection is via the sign of the first derivative – the derivative at saddle points will be 0.

    Local Minima – Bisection Method
    Similar to the method for finding a root of a non-linear equation: start with an interval that contains a minimum.

    Golden Search Method
    http://en.wikipedia.org/wiki/Golden_section_search
    C++: http://www.codecogs.com/code/maths/optimization/golden.php
    Bisect intervals according to the golden ratio 0.618… Achieves the reduction by evaluating a single new function value per step instead of 2 (a minimal sketch appears at the end of this Optimization section).

    Newton-Raphson Method
    Can also be applied to optimization, by finding a root of the first derivative.

    Brent Method
    http://en.wikipedia.org/wiki/Brent's_method
    C++: http://people.sc.fsu.edu/~jburkardt/cpp_src/brent/brent.cpp
    Based on quadratic or parabolic interpolation – if the function is smooth & parabolic near the minimum, then a parabola fitted through any 3 points should approximate the minimum. Fails when the 3 points are collinear, in which case the denominator is 0.

    Simplex Method
    http://en.wikipedia.org/wiki/Simplex_algorithm
    C++: http://www.codeguru.com/cpp/article.php/c17505/Simplex-Optimization-Algorithm-and-Implemetation-in-C-Programming.htm
    Finds the global minimum of any multi-variable function. Direct search – no derivatives required. At each step it maintains a non-degenerate simplex – a convex hull of n+1 vertices. Obtains the minimum for a function with n variables by evaluating the function at the simplex vertices, iteratively replacing the point with the worst result, shrinking the multidimensional simplex around the best point. Point replacement involves expanding & contracting the simplex near the worst-value point to determine a better replacement point. Oscillation can be avoided by choosing the 2nd-worst result. Restart if it gets stuck. Parameters: contraction & expansion factors.

    Simulated Annealing
    http://en.wikipedia.org/wiki/Simulated_annealing
    C++: http://code.google.com/p/cppsimulatedannealing/
    Analogy to heating & cooling metal to strengthen its structure. A stochastic method – applies a random permutation search for the global minimum, avoiding entrapment in local minima via hill climbing. Annealing schedule parameters: temperature, iterations at each temperature, temperature delta. The cooling schedule can be linear, step-wise or exponential.

    Differential Evolution
    http://en.wikipedia.org/wiki/Differential_evolution
    C++: http://www.amichel.com/de/doc/html/
    A more advanced stochastic method, analogous to biological processes (cf. genetic algorithms, evolution strategies). A parallel direct-search method over multiple discrete or continuous variables. An initial population of variable vectors is chosen randomly – if the weighted difference vector of 2 vectors yields a lower objective function value, then it replaces the comparison vector. Many parameters: number of parents, number of variables, step size, crossover constant, etc. Convergence is slow – many more function evaluations than simulated annealing.
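    A minimal golden-section sketch, as promised above (my own illustration, assuming a simple unimodal test function; caching the surviving interior point is what keeps it to one new function evaluation per step):

        #include <cmath>
        #include <cstdio>

        int main() {
            auto f = [](double x) { return (x - 2.0) * (x - 2.0) + 1.0; };  // minimum at x = 2

            const double invphi = (std::sqrt(5.0) - 1.0) / 2.0;  // 0.618...
            double a = 0.0, b = 5.0;
            double c = b - invphi * (b - a);   // interior points dividing [a, b]
            double d = a + invphi * (b - a);
            double fc = f(c), fd = f(d);
            while (b - a > 1e-9) {
                if (fc < fd) {                 // minimum lies in [a, d]
                    b = d; d = c; fd = fc;
                    c = b - invphi * (b - a);
                    fc = f(c);                 // the single new evaluation this step
                } else {                       // minimum lies in [c, b]
                    a = c; c = d; fc = fd;
                    d = a + invphi * (b - a);
                    fd = f(d);
                }
            }
            std::printf("min near x = %.9f\n", (a + b) / 2.0);
            return 0;
        }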
    Numerical Differentiation
    Overview: 2 approaches to finite difference methods:
    · A) Approximate the function via polynomial interpolation, then differentiate.
    · B) Taylor series approximation – additionally provides an error estimate.

    Finite Difference Methods
    http://en.wikipedia.org/wiki/Finite_difference_method
    C++: http://www.wpi.edu/Pubs/ETD/Available/etd-051807-164436/unrestricted/EAMPADU.pdf
    Find differences between high-order derivative values – approximate differential equations by finite differences at evenly spaced data points. Based on the forward & backward Taylor series expansions of f(x) about x plus or minus multiples of delta h. Forward/backward difference – the sum of the series contains the even derivatives and the difference of the series contains the odd derivatives: coupled equations that can be solved. Provides an approximation of the derivative to O(h^2) accuracy. There are also the central difference & extended central difference, which have O(h^4) accuracy.

    Richardson Extrapolation
    http://en.wikipedia.org/wiki/Richardson_extrapolation
    C++: http://mathscoding.blogspot.co.il/2012/02/introduction-richardson-extrapolation.html
    A sequence-acceleration method applied to finite differences. Fast convergence, high accuracy – O(h^4).

    Derivatives via Interpolation
    The finite difference method cannot be applied to discrete data points at uneven intervals – so approximate the derivative of f(x) using the derivative of the interpolant, via 3-point Lagrange interpolation. Note: the higher the order of the derivative, the lower the approximation precision.

    Numerical Integration
    Estimate finite & infinite integrals of functions. A more accurate procedure than numerical differentiation. Use when it is not possible to obtain the integral of a function analytically, or when the function is not given and only the data points are.

    Newton-Cotes Methods
    http://en.wikipedia.org/wiki/Newton%E2%80%93Cotes_formulas
    C++: http://www.siafoo.net/snippet/324
    For equally spaced data points. Computationally easy – based on local interpolation of n rectangular strip areas that is piecewise fitted to a polynomial to get the total area. Evaluate the integrand at n+1 evenly spaced points – approximate the definite integral by a sum whose weights are derived from Lagrange basis polynomials. Leverages the Trapezoidal Rule for the 2-point formula, Simpson's 1/3 Rule for 3-point formulas, Simpson's 3/8 Rule for 4-point formulas, and Bode's Rule for 5-point formulas. Higher orders obtain more accurate results. The Trapezoidal Rule uses simple areas; Simpson's Rule replaces the integrand f(x) with a quadratic polynomial p(x) that uses the same values as f(x) for its end points, but adds a midpoint (a minimal composite-trapezoid sketch appears at the end of this Integration section).

    Romberg Integration
    http://en.wikipedia.org/wiki/Romberg's_method
    C++: http://code.google.com/p/romberg-integration/downloads/detail?name=romberg.cpp&can=2&q=
    Combines the trapezoidal rule with Richardson extrapolation. Evaluates the integrand at equally spaced points; the integrand must have continuous derivatives. Each R(n,m) extrapolation uses a higher-order integrand polynomial replacement rule (the zeroth starts with the trapezoidal rule), yielding a lower-triangular matrix of equation coefficients where the bottom-right term has the most accurate approximation. The process continues until the difference between 2 successive diagonal terms becomes sufficiently small.
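    The promised composite-trapezoid sketch (my own illustration, not the linked siafoo snippet; trapezoid() is a hypothetical helper, tested against an integral with a known exact value):

        #include <cmath>
        #include <cstdio>

        // Composite trapezoidal rule: integrate f over [a, b] with n equal strips.
        double trapezoid(double (*f)(double), double a, double b, int n) {
            double h = (b - a) / n;
            double sum = 0.5 * (f(a) + f(b));   // end points carry half weight
            for (int i = 1; i < n; ++i)
                sum += f(a + i * h);            // interior points carry full weight
            return sum * h;
        }

        int main() {
            const double pi = std::acos(-1.0);
            // Integral of sin(x) over [0, pi] is exactly 2.
            std::printf("%.6f\n", trapezoid([](double x) { return std::sin(x); }, 0.0, pi, 1000));
            return 0;
        }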
    Gaussian Quadrature
    http://en.wikipedia.org/wiki/Gaussian_quadrature
    C++: http://www.alglib.net/integration/gaussianquadratures.php
    Data points are chosen to yield the best possible accuracy – requires fewer evaluations. Able to handle singularities and functions that are difficult to evaluate. The integrand can include a weighting function determined by a set of orthogonal polynomials. Points & weights are selected so that the quadrature yields the exact integral if f(x) is a polynomial of degree <= 2n+1. Techniques (basically different weighting functions):
    · Gauss-Legendre Integration: w(x) = 1
    · Gauss-Laguerre Integration: w(x) = e^-x
    · Gauss-Hermite Integration: w(x) = e^(-x^2)
    · Gauss-Chebyshev Integration: w(x) = 1 / sqrt(1 - x^2)

    Solving ODEs
    Use when high-order differential equations cannot be solved analytically. Evaluated under boundary conditions. RK for systems – a high-order differential equation can always be transformed into a coupled first-order system of equations.

    Euler Method
    http://en.wikipedia.org/wiki/Euler_method
    C++: http://rosettacode.org/wiki/Euler_method
    The first-order Runge-Kutta method. A simple recursive method – given an initial value, calculate derivative deltas. Unstable & not very accurate (O(h) global error) – not used in practice. As a first-order method, the local error (truncation error per step) is proportional to the square of the step size, and the global error (error at a given time) is proportional to the step size. In evolving the solution between data points xn & xn+1, it only evaluates derivatives at the beginning of the interval, xn – asymmetric at boundaries (a tiny sketch appears at the end of this ODE section).

    Higher-Order Runge-Kutta
    http://en.wikipedia.org/wiki/Runge%E2%80%93Kutta_methods
    C++: http://www.dreamincode.net/code/snippet1441.htm
    2nd- & 4th-order RK – introduces parameterized midpoints for more symmetric solutions: accuracy at a higher computational cost. Adaptive RK (RK-Fehlberg) – estimates the truncation error at each integration step & automatically adjusts the step size to keep the error within prescribed limits. At each step 2 approximations are compared – if they disagree to a specified accuracy, the step size is reduced.

    Boundary Value Problems
    Where the solutions of a differential equation are located at 2 different values of the independent variable x – more difficult, because one cannot just start at the point of the initial value; there may not be enough starting conditions available at the end points to produce a unique solution. An n-order equation will require n boundary conditions – need to determine the missing n-1 conditions which cause the given conditions at the other boundary to be satisfied.

    Shooting Method
    http://en.wikipedia.org/wiki/Shooting_method
    C++: http://ganeshtiwaridotcomdotnp.blogspot.co.il/2009/12/c-c-code-shooting-method-for-solving.html
    Iteratively guess the missing values for one end & integrate, then inspect the discrepancy with the boundary values at the other end to adjust the estimate. Given starting boundary values u1 & u2 which bracket the root u, solve for u via the false position method (solving the differential equation as an initial value problem via 4th-order RK), then use u to solve the differential equations.

    Finite Difference Method
    For linear & non-linear systems. Higher-order derivatives require more computational steps – some combinations of boundary conditions may not work, though. Improve the accuracy by increasing the number of mesh points.
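    The promised Euler sketch (my own illustration, not the Rosetta Code version; assuming the test problem dy/dx = y with y(0) = 1, whose exact solution is e^x):

        #include <cstdio>

        // Euler method: only the derivative at the start of each interval is
        // used, which is why the global error is O(h).
        int main() {
            auto dydx = [](double /*x*/, double y) { return y; };

            double x = 0.0, y = 1.0, h = 0.001;
            while (x < 1.0) {
                y += h * dydx(x, y);   // step along the tangent at (x, y)
                x += h;
            }
            std::printf("y(1) ~ %.6f (exact e = 2.718282)\n", y);
            return 0;
        }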
    Solving Eigenvalue Problems
    An eigenvalue can substitute for a matrix when doing matrix multiplication, converting a matrix multiplication into a polynomial. For a given set of equations in matrix form, determine the solution eigenvalues & eigenvectors. Similar matrices have the same eigenvalues. Use orthogonal similarity transforms to reduce a matrix to diagonal form, from which the eigenvalues & eigenvectors can be computed iteratively.

    Jacobi Method
    http://en.wikipedia.org/wiki/Jacobi_method
    C++: http://people.sc.fsu.edu/~jburkardt/classes/acs2_2008/openmp/jacobi/jacobi.html
    Robust but computationally intense – use for small matrices, < 10x10.

    Power Iteration
    http://en.wikipedia.org/wiki/Power_iteration
    For any given real symmetric matrix, generates the largest single eigenvalue & its eigenvector. The simplest method – does not compute a matrix decomposition, so it is suitable for large, sparse matrices.

    Inverse Iteration
    A variation of the power iteration method – generates the smallest eigenvalue, from the inverse matrix.

    Rayleigh Method
    http://en.wikipedia.org/wiki/Rayleigh's_method_of_dimensional_analysis
    A variation of the power iteration method.

    Rayleigh Quotient Method
    A variation of the inverse iteration method.

    Matrix Tri-diagonalization Method
    Use the Householder algorithm to reduce an NxN symmetric matrix to a tridiagonal real symmetric matrix via N-2 orthogonal transforms.

    What's Next
    Outside of numerical methods there are lots of different types of algorithms that I've learned over the decades: Data Mining (I covered this briefly in a previous post: http://geekswithblogs.net/JoshReuben/archive/2007/12/31/ssas-dm-algorithms.aspx), Search & Sort, Routing, Problem Solving, Logical Theorem Proving, Planning, Probabilistic Reasoning, Machine Learning, Solvers (e.g. MIP), Bioinformatics (Sequence Alignment, Protein Folding), and Quant Finance (I read Wilmott's books – interesting). Sooner or later, I'll cover the above topics as well.

    Read the article

  • Are you cashing in on the MVP complimentary subscriptions?

    - by Tarun Arora
    The two most asked questions in the Microsoft technology communities around the Microsoft MVP program are: 1. How do I become a Microsoft MVP? 2. What benefits do I get as an MVP? The answer to the first question has been well answered here. In this blog post, I'll try and answer the second question. Please find below a comprehensive list of Not-for-Resale personal subscriptions to various products that Microsoft MVPs are eligible for:
    · JetBrains – ReSharper, dotTrace, dotCover & WebStorm – https://www.jetbrains.com/resharper/buy/mvp.html
    · RedGate – SQL Server development, database administration, .NET development, Azure development (merged with Cerebrata), MySQL development, Oracle development – http://www.red-gate.com/community/mvp-program
    · Pluralsight – Pluralsight on-demand training – http://blog.pluralsight.com/2011/02/28/pluralsight-for-mvp/
    · Cerebrata – Cloud Storage Studio and Azure Diagnostics Manager (part of RedGate now) – https://www.cerebrata.com/Offers/mvp.aspx
    · Telerik – Telerik Ultimate Collection & Telerik TeamPulse – http://blogs.telerik.com/blogs/posts/11-03-01/telerik-gift-for-microsoft-mvps.aspx
    · Developer Express – DevExpress controls – http://www.devexpress.com/Home/Community/mvp.xml
    · InnerWorkings – 600 hours of .NET training catalogue – http://www.innerworkings.com/mvp
    · Typemock – Typemock Isolator, Typemock Isolator for SharePoint developers, Typemock Isolator for web developers, TestDriven.NET – http://www.typemock.com/mvp
    · SpeakFlow – A suite of tools for creating, managing, and delivering non-linear presentations – http://www.speakflow.com/
    · TechSmith – Camtasia Studio, SnagIt, Screencast – http://www.techsmith.com/camtasia.html
    · Altova – Altova XMLSpy – http://www.altova.com/xml-editor/
    · VisualSVN – VisualSVN Subversion integration plug-in for Visual Studio – http://www.visualsvn.com/visualsvn/purchase/mvp/
    · PreEmptive Solutions – Professional PreEmptive Analytics, Dotfuscator – http://www.preemptive.com/landing/mvp
    · Armadillo – Armadillo Adaptive Bug Prevention – http://www.armadilloverdrive.com/
    · IS Decisions – NFR licenses for UserLock, RemoteExec, FileAudit & WinReporter – http://www.isdecisions.com/download/mvp-mct-program.htm
    · Idera – SQL tools – http://www.idera.com/Content/Home.aspx
    · West Wind – Help Builder solution – http://www.west-wind.com/weblog/posts/2005/Mar/09/Are-you-a-Microsoft-MVP-Get-a-FREE-copy-of-West-Wind-Html-Help-Builder
    · Bamboo – SharePoint tools – http://community.bamboosolutions.com/blogs/partner-advantage-program/archive/2008/08/01/partner-advantage-program-mvp.aspx
    · Nitriq – Nitriq code analysis – http://blog.nitriq.com/FreeLicensesForMicrosoftMVPs.aspx
    · ByteScout – Components, libraries and developer tools – http://bytescout.com/buy/purchase_nfr_for_mvp.html
    · YourKit – Java and .NET profiler – http://yourkit.com/.net/profiler/index.jsp
    · Aspose – .NET components – http://www.aspose.com/corporate/community/2012_05_08_nfr-licenses-for-community-leaders.aspx
    Apart from Google/Bing-fu, stackoverflow and breathtech were a great help in compiling the above list. If you know of any other benefits, offers or complimentary subscriptions on offer for MVPs not covered in the list above, please add to the comment thread and I'll have the list updated. Enjoy!

    Read the article

  • Don’t Forget! In-Memory Databases are Hot

    - by andrewbrust
    If you’re left scratching your head over SAP’s intention to acquire Sybase for almost $6 billion, you’re not alone.  Despite Sybase’s 1990s reign as the supreme database standard in certain sectors (including Wall Street), the company’s flagship product has certainly fallen from grace.  Why would SAP pay a greater than 50% premium over Sybase’s closing price on the day of the announcement just to acquire a relational database which is firmly stuck in maintenance mode?
    Well, there’s more to Sybase than the relational database product.  Take, for example, its mobile application platform.  It hit Gartner’s “Leaders’ Quadrant” in January of last year, and SAP needs a good mobile play.  Beyond the platform itself, Sybase has a slew of mobile services; click this link to look them over.
    There’s a second major asset that Sybase has, though, and I wonder if it figured prominently into SAP’s bid: Sybase IQ.  Sybase IQ is a columnar database.  Columnar databases place values from a given database column contiguously, unlike conventional relational databases, which store all of a row’s data in close proximity (a toy sketch of the two layouts appears at the end of this post).  Storing column values together works well in aggregation reporting scenarios, because the figures to be aggregated can be scanned in one efficient step.  It also makes for high rates of compression, because values from a single column tend to be close to each other in magnitude and may contain long sequences of repeating values.  Highly compressible databases use much less disk storage and can be largely or wholly loaded into memory, resulting in lightning-fast query performance.  For an ERP company like SAP, with its own legacy BI platform (SAP BW) and the entire range of Business Objects and Crystal Reports BI products (which it acquired in 2007), query performance is extremely important.
    And it’s a competitive necessity too.  QlikTech has built an entire company on a columnar, in-memory BI product (QlikView).  So too has startup company Vertica.  IBM’s TM1 product has been doing in-memory OLAP for years.  And guess who else has the in-memory religion?  Microsoft does, in the form of its new PowerPivot product.  I expect the technology in PowerPivot to become strategic to the full-blown SQL Server Analysis Services product and the entire Microsoft BI stack.  I sure don’t blame SAP for jumping on the in-memory bandwagon, if indeed the Sybase acquisition is, at least in part, motivated by that.
    It will be interesting to watch and see what SAP does with Sybase’s product line-up (assuming the acquisition closes), including the core database, the mobile platform, IQ, and even tools like PowerBuilder.  It is also fascinating to watch columnar’s encroachment on relational.  Perhaps this acquisition will be columnar’s tipping point and people will no longer see it as a fad.  Are you listening, Larry Ellison?
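    The promised toy sketch of the row-versus-column layout idea (my own C++ illustration, not Sybase IQ or any product's code; TradeRow and TradeColumns are hypothetical types). Summing one field of a row store strides over every record, while the column store scans one contiguous, cache-friendly array:

        #include <cstdio>

        struct TradeRow { int id; double price; double qty; };   // row store: fields interleaved

        struct TradeColumns {                                    // column store: one array per field
            int    id[4];
            double price[4];
            double qty[4];
        };

        int main() {
            TradeRow rows[4] = {{1, 10.0, 5}, {2, 11.0, 3}, {3, 10.5, 7}, {4, 12.0, 2}};
            TradeColumns cols = {{1, 2, 3, 4}, {10.0, 11.0, 10.5, 12.0}, {5, 3, 7, 2}};

            double rowSum = 0, colSum = 0;
            for (int i = 0; i < 4; ++i) rowSum += rows[i].price;  // strides over unrelated fields
            for (int i = 0; i < 4; ++i) colSum += cols.price[i];  // one dense scan
            std::printf("%.1f %.1f\n", rowSum, colSum);
            return 0;
        }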

    Read the article

  • What is Inversion of control and why we need it?

    - by Jalpesh P. Vadgama
    Most programmers need the inversion of control pattern in today's complex real-time application world, so I have decided to write a blog post about it. This blog post will explain what Inversion of Control is and why we need it. We are going to take a real-world example so it will be easier to understand.
    The problem – why do we need Inversion of Control?
    Before giving a definition of Inversion of Control, let's take a simple real-world example to see why we need it. Please have a look at the following code.

        public class Class1
        {
            private Class2 _class2;

            public Class1()
            {
                _class2 = new Class2();
            }
        }

        public class Class2
        {
            // Some implementation of Class2
        }

    I have two classes, "Class1" and "Class2". If you look at the code, I have created an instance of the Class2 class in the Class1 constructor. So the Class1 class is dependent on Class2. I think that is the biggest issue in a real-world scenario, as if we change the Class2 class then we might need to change the Class1 class also. This kind of dependency between two classes is called tight coupling. Tight coupling causes lots of problems in real-world applications, because things tend to change in the future, so we would have to change all the tightly coupled classes that depend on each other. To avoid this kind of issue we need Inversion of Control.
    What is Inversion of Control?
    According to Wikipedia, the following is a definition of Inversion of Control: "In software engineering, Inversion of Control (IoC) is an object-oriented programming practice where the object coupling is bound at run time by an assembler object and is typically not known at compile time using static analysis."
    So if you read it carefully, it says that we should have object coupling at run time, not at compile time, where it is already known what object will be created, what method will be called, or what feature will be used. We need to design our classes in such a way that they are not tightly coupled to each other. There are multiple ways to implement Inversion of Control (a small sketch of one of them, constructor injection, follows below); you can refer to the Wikipedia link for the full set. In future posts we are going to see all the different ways of implementing Inversion of Control.
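    As a sketch of the loosely coupled alternative – constructor injection, one of the IoC techniques the Wikipedia article lists – here is a minimal illustration (in C++ rather than the C#-style snippet above, and with hypothetical names like IService and Consumer):

        #include <cstdio>
        #include <memory>

        // The dependency is expressed as an abstract interface...
        struct IService {
            virtual ~IService() = default;
            virtual void Run() = 0;
        };

        // ...with a concrete implementation that can be swapped without touching Consumer.
        struct ConcreteService : IService {
            void Run() override { std::printf("concrete service running\n"); }
        };

        // Consumer no longer news up its dependency; it receives it at run time.
        class Consumer {
        public:
            explicit Consumer(std::unique_ptr<IService> service)
                : service_(std::move(service)) {}
            void Work() { service_->Run(); }
        private:
            std::unique_ptr<IService> service_;
        };

        int main() {
            // An "assembler" (here just main) binds the coupling at run time.
            Consumer consumer(std::make_unique<ConcreteService>());
            consumer.Work();
            return 0;
        }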

    Read the article

  • Software Architecture and Software Architecture Evaluation

    How many of us have worked at places where the concept of software architecture was ridiculed as a waste of time and money? Even more ridiculous to them was the concept of evaluating software architecture. The next time I am in this situation – and I hope that I never am – I will have to push for this methodology in the software development life cycle. I have spent way too many hours/days/months/years working on poorly architected systems, or systems that were just built ad hoc. This must stop in software development. I can understand how systems get like this, due to overzealous sales staff, demanding management that wants everything yesterday, and project managers asking if things are done yet before the project has even started. But seriously, some time must be spent designing the applications that we write, along with evaluating the architecture so that it will integrate well within the existing systems of an organization. If placed in this situation again, I will strive to gain buy-in from key players within the business – for example: senior software engineers/developers, software architects, project managers, software quality assurance, technical services, operations, and finance – in order for this idea to succeed with upper management. In order to convince these key players I will have to show them the benefits of architecture, and even more the benefits of evaluating software architecture on a system-wide level.
    Benefits of Software Architecture Evaluation
    · Places stakeholders in the same room to communicate
    · Ensures delivery of detailed quality goals
    · Prioritizes conflicting goals
    · Requires clear explication
    · Improves the quality of documentation
    · Discovers opportunities for cross-project reuse
    · Improves architecture practices
    Once I had key-player buy-in, then and only then would I approach upper management about my plan for implementing the concept of software architecture, using evaluation to ensure that the software being designed is the proper architecture for the project. In addition to the benefits listed above, I would also show upper management how much time is being wasted by not doing these evaluations. For example, if project X cost us Y amount, then why do we have several implementations of X in various forms, and how much money and time could we have saved if we had just reused the existing code base to give each system the same functionality that was already created? After this, I would mention what would happen if we had 50 instances of this situation. Then I would show them how the software architecture evaluation process would have prevented this, and how the organization could have leveraged its existing code base to increase the speed and quality of its development.
    References:
    Carnegie Mellon Software Engineering Institute (2011). Architecture Tradeoff Analysis Method. http://www.sei.cmu.edu/architecture/tools/evaluate/atam.cfm

    Read the article

  • Administering Team Foundation Server 2010 Class resource links

    - by John Alexander
    Here are the resource links for the Administering Team Foundation Server 2010 class from last week in Minneapolis.
    · Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Microsoft® Virtual PC 2007 SP1 – http://www.microsoft.com/downloads/en/details.aspx?FamilyID=5e13b15a-fd74-4cd7-b53e-bdf9456855bd
    · Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Windows Virtual PC – http://www.microsoft.com/downloads/en/details.aspx?FamilyID=509c3ba1-4efc-42b5-b6d8-0232b2cbb26e
    · Microsoft® Visual Studio® 2010 and Team Foundation Server® 2010 RTM virtual machine for Windows Server 2008 Hyper-V – http://www.microsoft.com/downloads/en/details.aspx?FamilyID=e0198b64-4acb-4709-b07f-359fb4d523bc
    · Customizable process guidance – http://blogs.msdn.com/b/allclark/archive/2010/08/12/customizable-process-guidance.aspx
    · The 5 most read Visual Studio ALM help topics on MSDN – http://blogs.msdn.com/b/allclark/archive/2010/11/12/the-5-most-read-visual-studio-alm-help-topics-on-msdn.aspx
    · Inside TFS – http://visualstudiomagazine.com/Articles/List/Inside-TFS.aspx
    · Testing topics – http://msdn.microsoft.com/en-us/library/dd286594.aspx
    · Blogs – http://community.accentient.com and http://geekswithblogs.net
    · Branching Guide – http://tfsbranchingguideiii.codeplex.com/
    · Great VSTS blog – http://geekswithblogs.net/hinshelm/Default.aspx
    · My blog :D – http://geekswithblogs.net/jalexander/Default.aspx
    · Visual Studio Forums – http://bit.ly/fE16u3
    · TFS Migration and Integration Solutions – http://bit.ly/cLaBnT
    · TFS Migration and Integration Tools (VS ALM Rangers) – http://bit.ly/9tHWdG
    · TFS Migration and Integration Platform (CodePlex) – http://tfsintegration.codeplex.com
    · Team Foundation Server SDK – http://code.msdn.microsoft.com/TfsSdk
    · Migration and Integration Forum – http://bit.ly/f4Lnps
    · Team Foundation Server Widgets – http://www.tfswidgets.com
    · VS Upgrade Guide – http://vs2010upgradeguide.codeplex.com
    · Updating an Upgraded Team Project to Access New Features – http://bit.ly/9cCcMP
    · Team Foundation Power Tools – http://bit.ly/dfNVQk
    · Team Foundation Administration Tool – http://tfsadmin.codeplex.com
    · Using Team Foundation Server Command-Line Tools – http://bit.ly/hCyozJ
    · Changing Groups and Permissions with TFSSecurity – http://bit.ly/esIjgw
    · Unofficial prep guide for the TFS 2010 Administration Exam (70-512) – http://geekswithblogs.net/enriquelima/archive/2010/07/21/unofficial-prep-guide-for-tfs-2010-administration-exam-70-512.aspx
    · Another prep guide – http://bit.ly/bpO30R
    · Professional Application Lifecycle Management with VS 2010 book – http://bit.ly/9rCIRj
    · Search CodePlex for TFS-related apps – http://www.codeplex.com/site/search
    · Visual Studio Gallery – http://visualstudiogallery.com
    · Migrate from Visual SourceSafe – http://bit.ly/8XPSRh
    · Team Foundation Server MSSCCI Provider 2010 – http://bit.ly/dst1OQ
    · Attrice TFS Sidekicks – www.attrice.info/cm/tfs
    · Hosted TFS – http://bit.ly/cMZdvp
    · Manually Processing the Team Foundation Server 2010 Data Warehouse and Analysis Services Database – http://bit.ly/aG5oEh
    · TFS 2005, 2008 and 2010 Compatibility – http://shrinkster.com/1dhj

    Read the article

  • Survey Probes the Project Management Concerns of Financial Services Executives

    - by Melissa Centurio Lopes
    Do you wonder what the top reasons are why large projects in the financial industry fail to meet budgets, schedules, and other key performance criteria? Being able to answer this question can provide important insight into the value of good project management practices for your organization. According to 400 senior executives who participated in a new survey conducted by the Economist Intelligence Unit and sponsored by Oracle, unrealistic project goals are the main roadblock to success. Other common stumbling blocks are poor alignment between project and organizational goals, inadequate human resources, lack of strong leadership, and unwillingness among team members to point out problems. This survey sample also had a lot to say about the impact of regulatory compliance on the overall portfolio management process. Thirty-nine percent acknowledged that regulations enabled the efficient functioning of their businesses. But a similar number said that regulations often require more financial resources than were originally allocated to bring projects in on time. Regulations were seen by 35 percent of the executives as roadblocks to their ability to invest in the organization’s growth and success. These revelations, among others, are discussed in depth in a new on-demand Webcast titled “Too Good to Fail: Developing Project Management Expertise in Financial Services,” now available from Oracle. The Webcast features Brian Gardner, editor of the Economist Intelligence Unit, who presents the findings from this survey along with Guy Barlow, director of industry strategy for Oracle Primavera. Together, they analyze what the numbers mean for project and program managers and the financial services industry. Register today to watch the on-demand Webcast and get a full rundown and analysis of the survey results. Take the Economist Intelligence Unit benchmarking survey and see how your views compare with those of other financial services industry executives in ensuring project success. Read more in the October edition of the quarterly Information InDepth EPPM Newsletter.

    Read the article

  • MAXDOP in SQL Azure

    - by Herve Roggero
    In my search for a better understanding of the scalability options of SQL Azure I stumbled on an interesting aspect: query hints in SQL Azure – more specifically, the MAXDOP hint. A few years ago I did a lot of analysis on this query hint (see my article on SQL Server Central: http://www.sqlservercentral.com/articles/Configuring/managingmaxdegreeofparallelism/1029/). Here is a quick synopsis of MAXDOP: it is a query hint you use when issuing a SQL statement (e.g. by appending OPTION (MAXDOP 4) to a query) that gives you control over how many processors SQL Server will use to execute the query. For complex queries with lots of I/O requirements, more CPUs can mean faster parallel searches. However, the impact can be drastic on other running threads/processes. If your query takes all available processors at 100% for 5 minutes... guess what... nothing else works. The bottom line is that more is not always better. The use of MAXDOP is more art than science – and a whole lot of testing; it depends on two things: the underlying hardware architecture and the application design. So there isn't a magic number that will work for everyone... except 1... :) Let me explain. The rules of engagement are different. SQL Azure is about sharing. Yep... you are forced to play nice with your neighbors. To achieve this goal SQL Azure sets the MAXDOP to 1 by default, and ignores the use of the MAXDOP hint altogether. That means that all your queries will use one and only one processor. It really isn't such a bad thing, however. Keep in mind that in some of the largest SQL Server implementations MAXDOP is usually also set to 1. It is a well-known configuration setting for large-scale implementations. The reason is precisely to prevent rogue statements (like a SELECT * FROM HISTORY) from bringing down your systems (like a report that should have been running on a different server in the first place), and to avoid the overhead generated by executing too many parallel queries, which could cause internal memory management nightmares for the host operating system. In summary, forcing the MAXDOP to 1 in SQL Azure makes sense; it ensures that your database will continue to function normally even if one of the other tenants on the same server is running massive queries that would otherwise bring you down. Last but not least, keep in mind as well that when you test your database code for performance on-premise, make sure to set the DOP to 1 on your SQL Server databases to simulate SQL Azure conditions.

    Read the article

  • Going for Gold

    - by Simple-Talk Editorial Team
    There was a spring in the step of some members of our development teams here at Red Gate on hearing that they had won five gold awards at 2012′s SQL Mag Community and Editors’ Choice Awards. And why not? It’s a nice recognition that their efforts were appreciated by many in the SQL Server community. The team at Simple-Talk don’t tend to spring, but even we felt a twinge of pride in the fact that SQL Scripts Manager received Gold for Editor’s Choice in the Best Free Tools category. The tool began life as a “Down Tools” project and is one that we’ve supported and championed in various articles on Simple-talk.com. Over a Cambridge Bitter in the Waggon and Horses, we’ve often reflected on how nice it would be to nominate our own awards. Of course, we’d have to avoid nominating Red Gate tools in each category, even the free ones, for fear of seeming biased, but we could still award other people’s free tools, couldn’t we? So allow us to set the stage for the annual Simple-Talk Community Tool awards… Onto the platform we shuffle, to applause from the audience; Chris in immaculate tuxedo, Alice in stunning evening gown, Dave and Tony looking vaguely uncomfortable, Andrew somehow distracted, as if his mind is elsewhere. Tony strides up to the lectern, and coughs lightly… “In the free-tool category we have three nominations, and they are…” (rustle of the envelope opening) Ola Hallengren’s SQL Server Maintenance Solution (applause), Adam Machanic’s WhoIsActive (cheers, more applause), Brent Ozar’s sp_Blitz (much clapping). “Before we declare the winner, I’d like to say a few words in recognition of a grand tradition in a SQL Server community that continues to offer its members a steady supply of excellent, free tools. It hammers home the fundamental principle that a tool should solve a single, pressing and frustrating problem, and that you should only ever build your own solution to that problem if you are certain that you cannot buy it, or that someone has not already provided it free. We have only three finalists tonight, but I feel compelled to mention a few other tools that we also use and appreciate, such as Microsoft’s LogParser, the open source curl, Microsoft’s TableDiff.exe, the Performance Analysis of Logs (PAL) tool, SQL Server Cache Manager and SQLPSX.” “And now I’ll hand over to Alice to announce the winner.” Alice strides over to the microphone, tearing open the envelope. “The winner,” she pauses for dramatic effect, “… is… Ola Hallengren’s SQL Server Maintenance Solution!” Cue much applause and consumption of champagne. Did we get it wrong? What free tool would you nominate? Let us know! Cheers, Simple-Talk Editorial Team (Andrew, Alice, Chris, Dave, Tony)

    Read the article

  • Changing Focus on my Blog

    - by D'Arcy Lussier
    I try to limit these types of blog posts – the ones where I communicate some change as if I have a loyal subscriber base that will somehow be affected. Still, I think it’s of worth, if for nothing else than to document for myself an acknowledgement that my career is evolving. For the last who knows how long, I’ve had this as my banner. It’s funny how technology focuses change over time. 3.5–4 years ago I wanted to immerse myself in BizTalk. Then I shifted, focussing on Silverlight. I even started a short-lived Silverlight user group here in Winnipeg that had, IMO, one of the *best* UG logos ever (do a Google search for the old school Winnipeg Jets logo if you don’t catch the reference)… And even how I identified myself – as a Developer – isn’t really accurate anymore, as I’ve shifted more into an architect/analyst role at Online Business Systems as well as getting much more involved in business development. So I’m switching the focus of this blog a bit. Nothing major, but you’ll find my posts aren’t necessarily tied to a technology or platform. Instead I’ll be focussing on current passions and interests:
    Solution Architecture – Before a line of code is written, a solution is envisioned. The process of performing solution analysis and architecture is an intriguing one that encompasses negotiation and interpersonal skills as much as technical knowledge.
    Business & Entrepreneurship – Creating things, building things, and working with others – business is fascinating and exciting! Entrepreneurship, and intrapreneurship, are growing trends that I’ve been exploring over the last few years through my conference (www.prairiedevcon.com) and within Online.
    Microsoft – At Online one of my roles is “Microsoft Practice Lead” and my entire career has been built around the Microsoft stack of technologies. That focus won’t change here on my blog, and there’s tonnes of exciting new products and technologies coming out of Redmond.
    Adoption – This is a very personal subject that’s extremely close to my heart. I’m not talking about technology adoption, I’m talking about human adoption. Almost three years ago we adopted our first daughter, Sadie, and two years ago we adopted our second daughter, Skylar; an amazing new chapter in my life as I became a “parent”. Adoption is very much misunderstood, and many people have questions about it. Hopefully I can shed some light on our experiences and provide some guidance for those that are looking into it.
    So come along with me as I start chronicling the next phase of my career and life.

    Read the article

  • Oracle Customer Hub - Directions, Roadmap and Customer Success

    - by Mala Narasimharajan
    By Gurinder Bahl
    With less than a week to go until OOW 2012, I would like to introduce you all to the core Oracle Customer MDM strategy sessions. Fragmentation of customer data across disparate systems prevents companies from achieving a complete and accurate view of their customers. Oracle Customer Hub provides a comprehensive set of services, utilities and applications to create and maintain a trusted master customer system of record across the enterprise. Customer Hub centralizes customer data from disparate systems across your enterprise into a master repository. Existing systems are integrated in real time or via batch with the Hub, allowing you to leverage legacy platform investments while capitalizing on the benefits of a single customer identity. Don’t miss out on two sessions geared towards Oracle Customer Hub:
    1) Attend session CON9747 – Turn Customer Data into an Enterprise Asset with Oracle Fusion Customer Hub Applications – at Oracle Open World 2012 on Monday, Oct 1st, 10:45 AM–11:45 AM @ Moscone West 2008. Manouj Tahiliani, Sr. Director of MDM Product Management, will provide insight into the vision of Oracle Fusion Customer Hub solutions and review the roadmap. You will discover how Fusion Customer MDM can help your enterprise improve data quality, create accurate and complete customer information, manage governance, and help create great customer experiences. You will also understand how to leverage data quality capabilities and create a sophisticated customer foundation within Oracle Fusion Applications. You will also hear Danette Patterson, Group Lead at Church Pension Group, talk about how Oracle Fusion Customer Hub applications provide a modern, next-generation, multi-domain foundation for managing customer information in a private cloud.
    2) Don’t miss session CON9692 – Customer MDM is Key to Strategic Business Success and Customer Experience Management – at Oracle Open World 2012 on Wednesday, October 3rd, from 3:30–4:30 pm @ Westin San Francisco Metropolitan 1. JP Hurtado, Director of Customer Systems, will provide insight into how RCCL overcame the challenges of data quality, guest recognition and a centralized customer view to provide a consolidated customer view to multiple reservation, CRM, marketing, service, sales, data warehouse and loyalty systems. You will learn how Royal Caribbean Cruise Lines (RCCL), which has over 30 million customers and maintains multiple brands, has leveraged Oracle Customer Hub (Siebel UCM) as the backbone of its customer data management strategy for the past 5 years. Gurinder Bahl from MDM Product Management will provide an update on the Oracle Customer Hub strategy, what we have achieved since last Open World, and our future plans for the Oracle Customer Hub. You will learn about Customer Hub data quality capabilities around data analysis, cleansing, matching and address validation, as well as reporting and monitoring capabilities.
    The MDM track at Oracle Open World covers a variety of topics related to MDM. In addition to the product management team presenting product updates and the roadmap, we have several customer panels and conference sessions. You can see an overview of the MDM sessions here. Looking forward to seeing you at Open World, the perfect opportunity to learn about cutting-edge Oracle technologies.

    Read the article

  • Does Scrum turn active developers into passive developers?

    - by Saeed Neamati
    I'm a web developer working in a team of three developers and one designer. It's now been about five months since we implemented the agile Scrum software development methodology, but I have a weird feeling I just wanted to share on this site. One important factor in human life is the decision-making process. However, there is a big difference in the decisions you make. Some decisions are just the outcome of an internal or external force, while other decisions are completely based on your free will, and some decisions are simply something in between. The more freedom you have in making decisions, the more self-driven your work becomes. This seems to be a rule, because we tend to shape our lives ourselves. There is a big difference between deciding what to do and being told what to do. Before Scrum, I felt like I had more freedom in making the decisions related to development, analysis, prioritizing implementation, etc. I felt more like I was deciding what I was doing. However, due to the Scrum methodology, now many decisions simply come from the product owner. He prioritizes PBIs, he analyzes how the software should work, even sometimes how the UI and functionality should be implemented. I know that this is part of the Scrum methodology, and I also know that this may result in better sales of the product in the future. However, I now feel like I'm always being told to do something, instead of deciding to do something. This syndrome has made me more passive towards the work:
    · I tend to search less for a better solution, approach, or technique
    · I don't wake up in the morning expecting to get to an enjoyable job; rather, I feel like I'm being forced to work in order to live
    · I have more hunger to work on my own hobby projects after work
    · I won't push the team anymore to reach higher technological levels
    · I spend more time now on dinner or tea breaks and have less enthusiasm to get back to work
    · I'm now more eager for the workday to finish sooner, so that I can get home
    The big problem is, I see and diagnose this behavior in my colleagues too. Is it the outcome of Scrum? Does Scrum really make the development team feel like they have no part in shaping the overall software, thus making them passive towards the project? How can I overcome this feeling?

    Read the article

  • TSAM 11gR1

    - by todd.little
    The Tuxedo System and Application Monitor (TSAM) 11gR1 release provides powerful new application monitoring capabilities, as well as significant improvements in ease of use. The first thing users will notice is the completely redesigned user interface in the TSAM console. Based on Oracle ADF, the console is much easier to navigate, provides a Web 2.0-style interface with dynamically updating panels, and has a look and feel familiar to those who have used Oracle Enterprise Manager. Monitoring data can be viewed in both tabular and graphical form and exported to Excel for further analysis. A number of new metrics are collected and displayed in this release. Call path monitoring now displays CPU time, message size, total transport time, and client address, giving even more end-to-end information about a specific Tuxedo request. As well, the call path display has been completely revamped to make it much easier to see the branches of the call path. The call pattern display now provides statistics on successful vs failed calls, system and application failures, and end-to-end average elapsed time. Service monitoring now displays minimum and maximum message size, CPU usage, and client address. System server monitoring now includes the SALT gateway servers, providing detailed performance metrics about those servers. Perhaps the most significant new feature is the consolidation of alert definitions and policy management. In previous versions of TSAM, some alerts were defined and checked on the monitored systems while others were defined and checked in the console, and policy management could be performed both on the monitored node (via environment variable or command) and from the console. Now all alert definitions and policy definitions are made using the console only. For alerts this means that regardless of where an alert is evaluated, it is defined in one and only one place. Thus the plug-in alert mechanism of previous releases can now be managed using the TSAM console, making SLA alert definition much easier and cleaner. Finally, there is support in TSAM for monitoring rehosted mainframe applications. The newly announced Oracle Tuxedo Application Runtime for CICS and Batch can be monitored in the TSAM console using traditional mainframe views of the application, such as regions. Look for a future blog entry with more details on this, as well as some entries providing a glimpse of the console. TSAM gives users a single point for monitoring the performance of all of their Tuxedo applications.

    Read the article

  • Windows Azure Use Case: High-Performance Computing (HPC)

    - by BuckWoody
    This is one in a series of posts on when and where to use a distributed architecture design in your organization's computing needs. You can find the main post here: http://blogs.msdn.com/b/buckwoody/archive/2011/01/18/windows-azure-and-sql-azure-use-cases.aspx  Description: High-Performance Computing (also called Technical Computing) at its most simplistic is a layout of computer workloads where a “head node” accepts work requests and parcels them out to “worker nodes”. This is useful in cases such as scientific simulations, drug research, and MATLAB work, and anywhere else large compute loads are required. It’s not the immediate-result type of computing many are used to; instead, a “job” or group of work requests is sent to a cluster of computers, the worker nodes work on individual parts of the calculations, and they return the work to the scheduler or head node for the requestor in a batch-request fashion. This is typical of the way many mainframe computing use cases work. You can use commodity computers to create an HPC cluster, as in the Linux-based Beowulf approach, and Microsoft has a server product for HPC using standard computers, called Windows Compute Cluster Server, that you can read more about here. The issue with HPC (from any vendor) that some organizations have is the number of compute nodes they need. Having too many results in excess infrastructure, including computers, buildings, storage, heat, and so on. Having too few means that the work is slower and takes longer to return a result to the calling application. Unless there is a consistent level of work requested, predicting the number of nodes is problematic. Implementation: Recently, Microsoft announced an internal partnership between the HPC group (now called the Technical Computing Group) and Windows Azure. You now have two options for implementing an HPC environment using Windows. You can extend the current infrastructure you have for HPC by adding Compute Nodes in Windows Azure, using a “Broker Node”. You can then purchase time to add machines, and stop paying for them when the work is completed. This is a common pattern in groups that have a constant need for HPC but need to “burst” that node count under certain conditions. The second option is to install only a Head Node and a Broker Node onsite, and host all Compute Nodes in Windows Azure. This is often the pattern for organizations that need HPC on a scheduled and periodic basis, such as financial analysis or actuarial table calculations. References: Blog entry on Hybrid HPC with Windows Azure: http://blogs.msdn.com/b/ignitionshowcase/archive/2010/12/13/high-performance-computing-on-premise-and-in-the-windows-azure-cloud.aspx  Links for further research on HPC, including Windows Azure information: http://blogs.msdn.com/b/ncdevguy/archive/2011/02/16/handy-links-for-hpc-and-azure.aspx 
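    To make the head-node/worker-node pattern concrete, here is a minimal Python sketch of the scatter/gather batch flow described above. It uses the standard multiprocessing module as a stand-in for a real HPC scheduler; the simulate() function is a hypothetical placeholder for one unit of work, not part of any Windows HPC API.

        # Minimal sketch of the head node / worker node batch pattern.
        # multiprocessing.Pool stands in for a real scheduler; simulate()
        # is a hypothetical placeholder for one unit of compute work.
        from multiprocessing import Pool

        def simulate(params):
            # Work a worker node would perform on its slice of the job.
            x, y = params
            return x * y  # placeholder calculation

        if __name__ == "__main__":
            # "Head node": split the job into work items, scatter them to
            # the workers as a batch, then gather the partial results.
            job = [(i, i + 1) for i in range(1000)]
            with Pool(processes=8) as workers:   # stand-in for compute nodes
                results = workers.map(simulate, job)
            print(sum(results))  # combined result returned to the requestor

    The key property of the pattern, and the reason node count matters so much, is that the head node blocks until the whole batch is gathered, so total turnaround time is governed by the slowest worker and the size of the pool.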

    Read the article

  • SQLAuthority News – SQL Server 2012 Upgrade Technical Guide – A Comprehensive Whitepaper – (454 pages – 9 MB)

    - by pinaldave
    Microsoft has just released the SQL Server 2012 Upgrade Technical Guide. This guide is very comprehensive and covers the subject of upgrading in depth. It is indeed a helpful, detailed white paper; even a summary of it would run over 100 pages. This further shows that SQL Server 2012 is quite an important release from Microsoft. The white paper discusses how to upgrade from SQL Server 2008/R2 to SQL Server 2012. I love how it starts with the most interesting and basic discussion of upgrade strategies: 1) in-place upgrade, 2) side-by-side upgrade, 3) one-server, and 4) two-server. This whitepaper is not just pure theory but is also an excellent source of tips and tricks. Here is an example of a good tip from the paper: “If you want to upgrade just one database from a legacy instance of SQL Server and not upgrade the other databases on the server, use the side-by-side upgrade method instead of the in-place method.” There are so many pieces of trivia, tips, and tricks that compiling a complete list of them seems humanly impossible in a short period of time. My friend Vinod Kumar, a SQL Server expert, has previously written a very interesting article on upgrading to SQL Server 2012. In that article, Vinod addressed the most interesting and practical questions related to upgrades. He started with the fundamentals, such as taking a backup before the upgrade, and ended with fail-safe strategies for after the upgrade is over. He covered the concepts end to end in his blog post, in simple words and extremely precise statements. A successful upgrade uses a cycle of planning, documenting the process, testing, refining the process, testing again, planning the upgrade window, executing, verifying the upgrade, and opening for business. If you visit Vinod’s blog post, I suggest you go all the way down and collect the gold mine of most important links. I have bookmarked the post by blogging about it, and I suggest you bookmark it as well in whatever way you prefer. Vinod Kumar’s blog post on SQL Server 2012 Upgrade Technical Guide. The SQL Server 2012 Upgrade Technical Guide is a detailed resource that’s also available online for free. Each chapter is carefully crafted and explained in detail. Before downloading the guide, note its size: 9 MB and 454 pages. Here’s the list of chapters:
    Chapter 1: Upgrade Planning and Deployment
    Chapter 2: Management Tools
    Chapter 3: Relational Databases
    Chapter 4: High Availability
    Chapter 5: Database Security
    Chapter 6: Full-Text Search
    Chapter 7: Service Broker
    Chapter 8: SQL Server Express
    Chapter 9: SQL Server Data Tools
    Chapter 10: Transact-SQL Queries
    Chapter 11: Spatial Data
    Chapter 12: XML and XQuery
    Chapter 13: CLR
    Chapter 14: SQL Server Management Objects
    Chapter 15: Business Intelligence Tools
    Chapter 16: Analysis Services
    Chapter 17: Integration Services
    Chapter 18: Reporting Services
    Chapter 19: Data Mining
    Chapter 20: Other Microsoft Applications and Platforms
    Appendix 1: Version and Edition Upgrade Paths
    Appendix 2: SQL Server 2012: Upgrade Planning Checklist
    Download SQL Server 2012 Upgrade Technical Guide [454 pages and 9 MB]
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Database, DBA, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, SQLServer, T SQL, Technology

    Read the article

  • Tellago announces SQL Server 2008 R2 BI quick adoption programs

    - by Vishal
    During the last year, we (Tellago) have been involved in various business intelligence initiatives that leverage emerging BI techniques such as self-service BI and complex event processing (CEP). Specifically, in the last few months, we have partnered with Microsoft to deliver a series of events across the country where we present the different technologies of the SQL Server 2008 R2 BI stack, such as PowerPivot, StreamInsight, Ad-Hoc Reporting, and Master Data Services. As part of those events, we try to go beyond the traditional technology presentation and provide a series of best practices and lessons we have learned on real-world BI projects that leverage these technologies. Now that SQL Server 2008 R2 has been released to manufacturing, we have launched a series of quick adoption programs designed to help customers understand how they can embrace the newest additions to Microsoft's BI stack as part of their IT initiatives. The programs are also designed to help customers understand how the new SQL Server features interact with established technologies such as SQL Server Analysis Services and SQL Server Integration Services. We try to keep these adoption programs very practical by doing a lot of prototyping and design sessions that give our customers a practical glimpse of the technologies' capabilities and how they can fit into their enterprise architecture roadmap. Here is our official announcement (you can blame my business partner, BI enthusiast, and Tellago's CEO Elizabeth Redding for the marketing pitch ;)): Tellago Marks Microsoft's SQL Server 2008 R2 Launch With Business Intelligence Quick Adoption Program. Microsoft launched SQL Server 2008 R2 last week, which delivers several breakthrough business intelligence (BI) capabilities that enable organizations to:
    - Efficiently process, analyze, and mine data
    - Improve IT and developer efficiency
    - Enable highly scalable and well-managed business intelligence on a self-service basis for business users
    The release offers a new feature called PowerPivot, which enables self-service BI by connecting business users directly to enterprise data sources and providing improved reporting and analytics. The release also offers Master Data Management, which helps enterprises centrally manage critical data assets company-wide and across diverse systems, enabling increased integrity of information over time. Finally, the release includes StreamInsight, a framework for implementing complex event processing (CEP) applications on the Microsoft platform. With StreamInsight, IT organizations can implement the infrastructure to process a large volume of events in near real time, execute continuous queries against event streams, and enable real-time business intelligence. As a thought leader in the business intelligence community, Tellago has recognized the occasion by launching a series of quick adoption programs to enable the adoption of this new BI technology stack in your enterprise. Our Quick Adoption programs are designed to help you:
    - Brainstorm BI solution options
    - Architect initial infrastructure components
    - Prototype key features of a solution
    As a 2-3 day program, our approach is more efficient and cost effective than a traditional proof of concept because it allows you to understand the new SQL Server 2008 R2 feature set while seeing directly how you can leverage it for your business intelligence needs.
If you are interested in learning more about the BI capabilities of Microsoft's Business Intelligence stack, including SQL Server 2008 R2, we can help. As industry experts and software content advisers to Microsoft, Tellago is the place where ideas meet technology expertise. Let us help you see for yourself the advantages that you can gain from Microsoft's SQL Server 2008 R2. Email or call for more information - [email protected] or 847-925-2399.

    Read the article

  • Gawker Passwords

    - by Nick Harrison
    There has been much news about the hack of the Gawker web sites. There has even been an analysis of the common passwords found. This list is embarrassing in many ways. The most common password was "123456". The second most common password was "password". Much has also been written providing advice on how to create good passwords. This article provides some interesting advice, none of which should be taken. Anyone reading my blog probably already knows the importance of strong passwords, so I am not going to reiterate the reasons here. My target audience is more the folks defining password complexity requirements. A user cannot come up with a strong password if we have complexity requirements that don't make sense. With that in mind, here are a few guidelines:
    Long Passwords
    Insist on long passwords. In some cases, you may need to change your system to allow longer passwords. I have seen many places that cap passwords at 8 characters. Passwords need to be at least 8 characters, as a bare minimum. Consider how much stronger the passwords would be if you doubled the length. Passwords that are 15-20 characters will be that much harder to crack. There is no need to limit passwords to 8 characters.
    Don't Require Special Characters
    Many complexity rules will require that your password include a capital letter, a lower case letter, a number, and one of the "special" characters, the shifted symbols above the number keys. The problem with such rules is that the resulting passwords are harder to remember. It also means that you will have a smaller set of characters in the resulting passwords. If you must include one of the 10 digits and one of the 10 "special" characters, then you have dramatically reduced the character set that will make up the final password. Two characters will each be one of 10 possible values instead of one of 70. Two additional characters (the required upper case and lower case letters) will each be one of 26 possible values instead of 70. If you limit passwords to 8 characters, you are left with only 4 characters having the full set of 70 potential values. With these character restrictions in place, there are about 1.6×10^12 possible passwords. Without these special character restrictions, but still allowing numbers and special characters, you get a total of 5.76×10^14 possible passwords. Even if you only allowed letters and digits, you would still have 2.18×10^14 passwords. You can do the math any number of ways; requiring special characters will always weaken passwords. Now imagine the number of passwords when you require more than 8 characters. If you are responsible for defining complexity rules, I urge you to take these guidelines into account. What other guidelines do you follow?
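    To sanity-check the arithmetic above, here is a short Python sketch (my own, not from the original post) that reproduces the three counts, using the post's figure of a 70-character full set:

        # Sanity check of the password-space arithmetic above (my sketch,
        # using the post's figure of 70 usable characters).
        FULL = 70        # full character set assumed in the post
        length = 8

        # Forced classes: one digit (10), one special (10), one upper (26),
        # one lower (26), and the remaining four positions unconstrained.
        forced = 10 * 10 * 26 * 26 * FULL ** (length - 4)

        unrestricted = FULL ** length    # no composition rules at all
        letters_digits = 62 ** length    # upper, lower, and digits only

        print(f"forced classes:   {forced:.3g}")          # ~1.62e+12
        print(f"no restrictions:  {unrestricted:.3g}")    # ~5.76e+14
        print(f"letters + digits: {letters_digits:.3g}")  # ~2.18e+14

    Even the smaller letters-plus-digits set without forced classes is two orders of magnitude larger than the forced-class space, which is the whole point of the guideline.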

    Read the article

  • Today's Links (6/20/2011)

    - by Bob Rhubart
    Why your security sucks | Eric Knorr: A conversation with InfoWorld security expert Roger Grimes reveals why the latest burst of attacks is just business as usual.
    JDev 11g R2 - ADF BC Dependency Diagram Feature | Andrejus Baranovskis: Oracle ACE Director Andrejus Baranovskis continues his exploration of JDeveloper 11g R2.
    Mobile Apps Put the Web in Their Rear-view Mirror | Charles Newark-French: "Our analysis shows that, for the first time ever, daily time spent in mobile apps surpasses desktop and mobile web consumption," says Newark-French. "This stat is even more remarkable if you consider that it took less than three years for native mobile apps to achieve this level of usage, driven primarily by the popularity of iOS and Android platforms."
    Vivek Kundra, a public servant who gets stuff done | Craig Newmark: Craigslist founder Craig Newmark bids farewell to the nation's first CIO.
    Weblogic, QBrowser and topics | Eric Elzinga: Elzinga says: "Besides using the Weblogic Console to add subscribers to our topics we can also use QBrowser to browse queues and topics on your Weblogic Server."
    Java EE talks at JAX Conf | Arun Gupta: Arun Gupta shares links to several Java EE presentations taking place at this week's JAX Conference in San Jose, CA.
    Development gotchas and silver bullets | Andy Mulholland: Mulholland explains why "Software development has to change to fit with new business practices!"
    Oracle is Proud Sponsor of Gartner Security and Risk Management Summit 2011 | Troy Kitch: Oracle will have a very strong presence at this year's Gartner Security and Risk Management Summit 2011 in Washington D.C., June 20-23.
    Database Web Service using Toplink DB Provider | Vishal Jain: "With JDeveloper 11gR2 you can now create database based web services using JAX-WS Provider," says Jain.
    Sample Chapter: A Fusion Applications Technical Overview: An excerpt from "Managing Oracle Fusion Applications" by Richard Bingham, published by Oracle Press, May 2011.
    White Paper: Oracle Optimized Solution for Enterprise Cloud Infrastructure: This paper provides recommendations and best practices for optimizing virtualization infrastructures when deploying the Oracle Enterprise Cloud Infrastructure.
    White paper: Oracle Optimized Solution for Lifecycle Content Management: Authors Donna Harland and Nick Klosk illustrate how Oracle Enterprise Content Management Suite and Oracle's Sun Storage Archive Manager work with Oracle's Sun hardware.
    Bay Area Coherence Special Interest Group: Thursday, July 21, 2011, 4:30pm - 8:15pm ET (note that parking at 475 Sansome closes at 8:30pm), Oracle Office, 475 Sansome Street, San Francisco, CA (Google Map). Speakers: Chris Akker, Solutions Engineer, F5; Paul Cleary, Application Architect, Oracle; Alexey Ragozin, Independent Consultant; Brian Oliver, Oracle

    Read the article

  • BI&EPM in Focus June 2013

    - by Mike.Hallett(at)Oracle-BI&EPM
    Analyst Report from Ovum: BI bites into a bigger slice of Oracle’s Red Stack
    Customers
    INC Research Ensures 24/7 Enterprise Application Availability and Supports Rapid Expansion in Asia with Managed Cloud Services – Hyperion Planning, PeopleSoft, E-Business Suite, SOA Suite
    PL Developments Improves Quality and Demand Planning Accuracy, Streamlines Compliance as It Moves into Manufacturing – Hyperion Planning, OBIEE, E-Business Suite Release 12.1, Agile, Demantra
    Kiabi Provides Store Managers with Monthly Earnings Statements in Four Business Days to Support Continued Retail Growth – Hyperion Planning, Hyperion Financial Reporting, Hyperion Smart View for Office
    Speedy Cash Improves Global Financial Budgeting and Forecasting to Support Continued Company Growth – Hyperion Planning, Essbase, Hyperion Smart View for Office, Hyperion Financial Management
    Grupo Sports World Automates and Reduces Budget Consolidation Time by 33% for 30 Fitness Centers – Hyperion Planning
    Jupiter Shop Channel Automates Budgeting Processes, Enhances Visibility of Project Investments to Support Strategic Decision-Making – Hyperion Planning
    GENBAND Saves US$1.25 Million Annually with Automated Global Trade Management, Gains Compliance Assurance – Hyperion Financial Management, E-Business Suite
    Aldar Properties Consolidates and Simplifies Group Planning and Reporting for Business and Finance Structures with Integrated ERP and Business Intelligence – Hyperion Planning, Essbase, Data Integrator, OBIEE, E-Business Suite, SUN
    Link to Complete Archive
    Enterprise Performance Management
    Hyperion EPM 11.1.2.3 Webcast Tutorials
    EPM Blog: Three Technologies CFOs Need to Know About
    The CFO as Catalyst for Change - Part 1
    The CFO as Catalyst for Change - Part 2
    Actions Speak Louder in Scorecards
    Unlocking Business Potential with Enterprise Performance Management
    Business Intelligence
    Oracle Database 12c is launched
    Analysis: How to Take Big Data Advantage of Oracle Database 12c by Data-informed.com

    Read the article

  • Scrum and Team Consolidation

    - by John K. Hines
    I’m still working my way through one of the more painful team consolidations of my career.  One thing that made it hard was my assumption that the use of Agile methods and Scrum would make everything easy.  Take three teams, make all work visible, track it, and presto: an efficient, functioning software development team. What I’ve come to realize is that the primary benefit of Scrum is that Scrum brings teams closer to their customers.  Frequent meetings, short iterations, and phased deployments are all meant to keep the customer in the loop.  It’s true that as teams become proficient with Scrum they tend to become more efficient.  But I don’t think it’s true that Scrum automatically helps people work together. Instead, Scrum can point out when teams aren’t good at working together.  And it really illustrates when teams, especially teams in sustaining mode, are reacting to their customers instead of innovating with them.  At the moment we’ve inherited a huge backlog of tools, processes, and personalities.  It’s up to us to sort them all out.  Unfortunately, after 7½ months we’re still sorting. What I’d recommend for any blended team is to look at your current product lifecycles and work on a single lifecycle for all work.  If you can’t objectively come up with one process, that’s a good indication that the new team might not be a good fit as a single unit (which happens all the time in bigger companies).  Go ahead & self-organize into sub-teams.  Then repeat the process. If you can come up with a single process, tackle each piece and standardize all of them.  Do this as soon as possible, as it can be uncomfortable.  Standardize your requirements gathering and tracking, your exploration and technical analysis, your project planning, development standards, validation and sustaining processes.  Standardize all of it.  Make this your top priority, get it out of the way, and get back to work. Lastly, managers of blended teams should realize that what I’m suggesting is a disruptive process.  But you’ve just reorganized; the team is already disrupted.  Don’t pull the bandage off slowly and force the team through a prolonged transition phase, lowering their productivity over the long term.  Model leadership to your team and drive a true consolidation.  Destroy roadblocks, reassure those on your team who are afraid of change, and push forward to create something efficient and beautiful.  Then use Scrum to reengage your customers in a way that they’ll love. Technorati tags: Scrum, Scrum Process

    Read the article

  • July, the 31 Days of SQL Server DMO’s – Day 24 (sys.dm_db_index_operational_stats)

    - by Tamarick Hill
    The sys.dm_db_index_operational_stats Dynamic Management Function returns information about the IO, locking, and access methods for the indexes you currently have on your SQL Server instance. This function takes four input parameters: (1) database_id, (2) object_id, (3) index_id, and (4) partition_number. Let’s have a look at the results from this function against our AdventureWorks2012 database. This function returns a ton of columns, so not only will I not attempt to describe each of them, I won’t even attempt to display all of them here. My query below will give you a subset of the columns returned from this function.

        SELECT database_id, object_id, index_id, partition_number,
               leaf_insert_count, leaf_delete_count, leaf_update_count, leaf_ghost_count,
               nonleaf_insert_count, nonleaf_delete_count, nonleaf_update_count,
               range_scan_count, forwarded_fetch_count,
               row_lock_count, row_lock_wait_count,
               page_lock_count, page_lock_wait_count,
               index_lock_promotion_attempt_count, index_lock_promotion_count,
               page_compression_attempt_count, page_compression_success_count
        FROM sys.dm_db_index_operational_stats(DB_ID('AdventureWorks2012'), NULL, NULL, NULL)

    The first four columns in the result set represent the values we passed in as input parameters. If you use NULLs as I did, you will see results for every index on your system. I specified a database_id, so my result set only shows records pertaining to my AdventureWorks2012 database. The next columns provide information on how many inserts, deletes, or updates have taken place at your leaf and nonleaf index levels; the nonleaf levels refer to the intermediate and root index levels. In the middle of these you see a leaf_ghost_count column, which represents the number of records that have been logically deleted and marked as “ghosted” and are waiting on the background ghost cleanup process to physically remove them. The range_scan_count column represents the number of range or table scans that have been performed against an index. The forwarded_fetch_count column represents the number of rows that were returned via a forwarding row pointer. The row_lock_count and row_lock_wait_count columns represent the number of row locks that have been requested for an index and the number of times SQL Server has had to wait on a row lock, respectively. The page_lock_count and page_lock_wait_count columns represent the same for page locks. The index_lock_promotion_attempt_count column represents the number of times the database engine has attempted to promote a lock to the index level, and index_lock_promotion_count displays how many times that promotion was successful. Lastly, page_compression_attempt_count and page_compression_success_count represent how many times page compression was attempted and how many times the attempt was successful. As you can see, there is a ton of information returned from this DMF. The DMV we reviewed yesterday (sys.dm_db_index_usage_stats) provided good information on when and how indexes have been used, but this DMF takes an even deeper dive into those statistics. If you are interested in performing a very detailed analysis of the operational stats of your indexes, this is not only a good place to start, but more than likely the best place.
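    As a rough illustration of putting these counters to work, here is a small Python sketch (my own; the connection string and the 10% threshold are assumptions, not from the post) that pulls a few of the locking columns through pyodbc and flags indexes where a large share of row-lock requests had to wait:

        # Sketch: flag indexes with a high ratio of row-lock waits.
        # The connection string and 10% threshold are illustrative only.
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=localhost;DATABASE=AdventureWorks2012;"
            "Trusted_Connection=yes;"
        )

        sql = """
        SELECT object_id, index_id, row_lock_count, row_lock_wait_count
        FROM sys.dm_db_index_operational_stats(DB_ID(), NULL, NULL, NULL)
        WHERE row_lock_count > 0
        """

        for row in conn.cursor().execute(sql):
            wait_ratio = row.row_lock_wait_count / row.row_lock_count
            if wait_ratio > 0.10:  # arbitrary "worth a look" threshold
                print(f"object_id {row.object_id}, index_id {row.index_id}: "
                      f"{wait_ratio:.1%} of row-lock requests waited")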
For more information on this Dynamic Management Function, please see the Books Online link below: http://msdn.microsoft.com/en-us/library/ms174281.aspx Follow me on Twitter @PrimeTimeDBA

    Read the article

  • Using SQL Source Control and Vault Professional Part 4

    - by Ajarn Mark Caldwell
    Two weeks ago I upgraded our installation of Fortress to the latest version, which is now named Vault Professional.  This is the version of Vault (i.e. Vault Standard 5.1 / Vault Professional 5.1) that will be officially supported with Red-Gate SQL Source Control 2.1.  While the folks at Red-Gate did a fantastic job of working with me to get SQL Source Control to work with the older Fortress version, we weren’t going to just sit on that.  There are a couple of things that Vault Professional cleaned up for us, such as improved integration with Visual Studio 2010, so it was a win all around. Shortly after that upgrade, I received notice from Red-Gate that they had a new Early Access version of SQL Source Control available that included the ability to source control static data.  The idea here is that you probably have a few fairly static lookup tables in your system, and those data values are similar in concept to source code, and should be versioned in your source control management system also.  I agree with this, but please be wise…somebody out there is bound to try to use this feature as their disaster recovery for their entire database, and that is NOT the purpose.  First off, you should never have your PROD (or LIVE, whatever you call it) system attached to source control.  Source Control is for development, not for PROD systems.  Second, use the features that are intended for this purpose, such as BACKUP and RESTORE. Laying that tangent aside, it is great that now you can include these critical values in your repository and make them part of a deployment process.  As you would guess, SQL Source Control uses SQL Data Compare to create the data change scripts just like it uses SQL Compare to create the schema change scripts.  Once again, they did a very good job with the integration to their other products.  At this point we are really starting to see some good payback on our investment in the full SQL Developer Bundle.  Those products were worth the investment back when we only used them sporadically for troubleshooting and DBA analysis, but now with SQL Source Control, they are becoming everyday-use products for the development team. I like this software (SQL Source Control) so much that I am about to break my own rules and distribute it to my team to use even though it is still in beta.  This is the first time that I have approved the use of any beta software in a production scenario (actively building our next versions of internal software) but I predict that the usability and productivity gain of using SQL Source Control over manual scripting is worth the risk.  Of course, I have also put this beta software through its paces pretty well to be comfortable with it, and Red-Gate has proven their responsiveness to issues that came up in my early beta testing, and so I am willing to bet on their continued support.  Likewise, SourceGear, the maker of Vault Professional, has proven itself to me as well, and so the combination of SQL Source Control with Vault Professional is the new standard for my development team.

    Read the article

  • Oracle VM Templates for EBS 12.1.3 for Exalogic Now Available

    - by Elke Phelps (Oracle Development)
    Oracle VM Templates for Oracle E-Business Suite 12.1.3 for the x86 Exalogic Platform (64 bit) are now available on the Oracle Software Delivery Cloud.  The templates contain all the required elements to create an Oracle E-Business Suite R12 demonstration system on an Exalogic server. You can use these templates to quickly build an EBS 12.1.3 demonstration environment, bypassing the operating system and software installs (via the EBS Rapid Install).   The Oracle E-Business Suite Release 12.1.3 (64 bit) template for the Exalogic platform is an Oracle Virtual Server guest template that contains a complete Oracle E-Business Suite Release 12.1.3 database tier and application tier installation.  For additional details, please refer to the following My Oracle Support Note: Oracle E-Business Suite Release 12.1.3 Database Tier and Application Tier Template for Oracle Exalogic Platform (Note 1499132.1). The Oracle E-Business Suite system is installed on top of Oracle Linux Version 5 Update 6. The templates have been optimized for performance, including OS kernel settings and E-Business Suite configuration settings tuned specifically for the Exalogic platform.  The configuration delivered with this template for a mid-tier running on Exalogic will support hundreds of concurrent users.  Please refer to Section 2: Performance Analysis in My Oracle Support Note 1499132.1 for additional details.   Additional Information: The Oracle E-Business Suite VM templates for the Exalogic platform contain the following software versions:
    Operating System: Oracle Linux Version 5 Update 6
    Oracle E-Business Suite 12.1.3 (Database Tier)
    Oracle E-Business Suite 12.1.3 (Application Tier)
    The following considerations were made when the Oracle E-Business Suite VM templates for the Exalogic platform were designed:
    Templates use the hardware-virtualized architecture, supporting hardware with the virtualization feature.
    The Database Tier template is configured with 16 GB RAM, 4 VCPUs, and 250 GB of disk space for the application installation.
    The Application Tier template is configured with 16 GB RAM, 4 VCPUs, and 50 GB of disk space for the application installation.
    References: Oracle E-Business Suite Release 12.1.3 Database Tier and Application Tier Template for Oracle Exalogic Platform (Note 1499132.1)
    Related Articles:
    Part 1: E-Business Suite 12.1.1 Templates for Oracle VM Now Available
    Part 2: Using Oracle VM with Oracle E-Business Suite Virtualization Kit
    Part 3: On Clouds and Virtualization in EBS Environments (OpenWorld 2009 Recap)
    Part 4: Deploying E-Business Suite on Amazon Web Services Elastic Compute Cloud
    Part 5: Live Migration of EBS Services Using Oracle VM
    Support Policies for Virtualization Technologies and Oracle E-Business Suite
    Virtualization and the E-Business Suite, Redux
    Virtualization and E-Business Suite

    Read the article

  • More details on America's Cup use of Oracle Data Mining

    - by charlie.berger
    BMW Oracle Racing's America's Cup: A Victory for Database Technology
    BMW Oracle Racing's victory in the 33rd America's Cup yacht race in February showcased the crew's extraordinary sailing expertise. But to hear them talk, the real stars weren't actually human. "The story of this race is in the technology," says Ian Burns, design coordinator for BMW Oracle Racing.
    Gathering and Mining Sailing Data
    From the drag-resistant hull to its 23-story wing sail, the BMW Oracle USA trimaran is a technological marvel. But to learn to sail it well, the crew needed to review enormous amounts of reliable data every time they took the boat for a test run. Burns and his team collected performance data from 250 sensors throughout the trimaran at the rate of 10 times per second. An hour of sailing alone generates 90 million data points. BMW Oracle Racing turned to Oracle Data Mining in Oracle Database 11g to extract maximum value from the data. Burns and his team reviewed and shared raw data with crew members daily using a Web application built in Oracle Application Express (Oracle APEX). "Someone would say, 'Wouldn't it be great if we could look at some new combination of numbers?' We could quickly build an Oracle Application Express application and share the information during the same meeting," says Burns.
    Analyzing Wind and Other Environmental Conditions
    Burns then streamed the data to the Oracle Austin Data Center, where a dedicated team tackled deeper analysis. Because the data was collected in an Oracle Database, the Data Center team could dive straight into the analytics problems without having to do any extract, transform, and load processes or data conversion. And the many advanced data mining algorithms in Oracle Data Mining allowed the analytics team to build vital performance analytics. For example, the technology team could remove masking elements such as environmental conditions to give accurate data on the best mast rotation for certain wind conditions. Without the data mining, Burns says the boat wouldn't have run as fast. "The design of the boat was important, but once you've got it designed, the whole race is down to how the guys can use it," he says. "With Oracle database technology we could compare the incremental improvements in our performance from the first day of sailing to the very last day. With data mining we could check data against the things we saw, and we could find things that weren't otherwise easily observable and findable."

    Read the article

  • TechEd 2012: Day 3 – Morning TFS

    - by Tim Murphy
    My morning sessions for day three were dominated by Team Foundation Server.  This has been a hot topic with our clients lately, so the sessions really struck a chord. The speaker for the first session was from Boeing.  It was nice to hear how a company mixes both agile and waterfall project management.  The approaches he presented were very pragmatic.  For their needs, reporting was the crucial part of their decision to use TFS.  This was interesting, since reporting is probably the last aspect most shops would think about. The challenge of getting users to adopt TFS was brought up by the audience.  As with the other discussion points, he took a very level-headed stance.  The approach he prescribed was to eat the elephant one bite at a time instead of all at once.  If you try to convert your entire shop at once, the culture shock will most likely kill the effort. Another key point he reminded us of is that you need to make sure standards and compliance are taken into account when you set up TFS.  If you don't implement the tool, and the processes around it, in a way that complies with the standards bodies that govern your business, you are in for a world of hurt. Ultimately, the reason they chose TFS was that it was the first tool to incorporate all the ALM features they needed.  It also reduced licensing costs, compared with all the different tools they would otherwise need to buy to complete the same tasks.  They got to this point by doing an industry evaluation.  Although TFS came out on top, he said it still has a big gap in the Java area.  Of course, in this market there are vendors helping to close that gap. The second session was on how continuous feedback in agile is a new focus in VS2012.  The problems they intended to address included cycle time, average time to repair, and root cause analysis. The speakers fired features at us as if they were firing a machine gun.  I will just say that I am looking forward to digging into the product after seeing this presentation.  Beyond that, I will simply list some of the key features that caught my attention:
    - Ability to link documents into tasks as artifacts
    - Web access portal
    - PowerPoint storyboards
    - Exploratory testing
    - Request feedback (allows users to record notes, screen shots, and video/audio)
    See you after the second half. del.icio.us Tags: TechEd,TechEd 2012,TFS,Team Foundation Server

    Read the article

< Previous Page | 71 72 73 74 75 76 77 78 79 80 81 82  | Next Page >