Search Results

Search found 3471 results on 139 pages for 'dynamics ax 2009'.

Page 85/139 | < Previous Page | 81 82 83 84 85 86 87 88 89 90 91 92  | Next Page >

  • Mobile Developer Economics: Calling all Developers!

    - by Jacob Lehrbaum
    VisionMobile is currently soliciting feedback for the second edition of their popular Mobile Developer Economics report. With the 2011 edition, VisionMobile is hoping to "see how the dynamics of the developer world have changed since early 2010 and to provide more insights into app marketing, monetization and many other factors." If you have a couple of minutes, please check out the survey and provide your input. In addition, please help us spread the word to make sure that Java developers are well represented in this survey. Thanks!

    Read the article

  • Have you ever wondered...?

    - by diana.gray
    I've often wondered why folks do the same thing over and over. For some of us, it's because we "don't get it" and there's an abundance of TV talk shows that will help us analyze the why of it. Dr. Phil is all too eager to ask "...and how's that working for you?". But I'm not referring to being stuck in a destructive pattern or denial. I'm really talking about doing something over and over because you have found a joy, a comfort, a boost of energy from an activity or event. For example, how many times have I planted bulbs in November or December only to be amazed by their reach, colors, and fragrance in early spring? Or baked fresh cookies and allowed the aroma to fill the house? Or kissed a sleeping baby held gently in my arms and been reminded of how tiny and fragile we all are. I've often wondered why it is that I get so much out of something I've done so many times. I think it's because I've changed. The activity may be the same but in the preceding days, months and years I've had new experiences, challenges, joys and sorrows that have shaped me. I'm different. The same is true about attending the Professional Businesswomen of California (PBWC) conference. Although the conference is an annual event held at San Francisco's Moscone Center, I still enjoy being with 3,000 other women like me. Yes, we work at different companies and in different industries, have different lifestyles and are at different stages in our professional careers and personal lives; but we are all alike in that we bring the NEW me each year that we attend. This year I can cheer when Safra Catz, President of Oracle, encourages us to trust our intuition; that "if something doesn't make sense, it doesn't make sense". And I can warmly introduce myself to Lisa Askins, Cheryl Melching's business partner at Center Stage Group, when I would have been too intimidated to do so last year. This year I can commit to new challenges such as "no whining, no excuses and no gossip" as suggested by Roxanne Emmerich, a goal that I would have wavered on last year. I can also embrace the suggestion given by Dr. Ian Smith to "spend one hour each day" on me - giving myself time to rejuvenate. A friend, when asked if she was attending PBWC this year, said "I've attended the conference several times and there's nothing new!" My perspective is that WE are what makes PBWC's annual conference new. We are far different in 2010 than we were in 2009. We are learning, growing, developing and shedding and that's what makes the conference fresh, vibrant, rewarding, and lasting. It is the diversity of women coming together that makes it new. By sharing our experiences, we discover. By meeting with one another professionally and personally, we connect. And by applying the wisdom learned, we shine. We are reNEW-ed. It shows in our fresh ideas, confident interactions, strategic decisions and successful businesses. This refreshed approach is what our companies want and need, our families depend on, our communities and nation look to for creative solutions to pressing concerns. Thanks Oracle for your continued support and thanks PBWC for providing an annual day to be reNEW-ed.

    Read the article

  • BizTalk Server 2010 Beta available

    - by Rajesh Charagandla
    BizTalk Server 2010 Beta - Click Here to Download. Overview: BizTalk Server 2010 offers significant enhancements to help integrate heterogeneous line-of-business systems with Windows, .NET, and SharePoint-based applications to optimize user productivity, gain business efficiency, and increase agility. BizTalk Server 2010 allows .NET developers to take advantage of BizTalk services right out of the box to rapidly build solutions that need to integrate transactions and data from applications like SAP, mainframes, MS Dynamics, and Oracle. Similarly, SharePoint developers can seamlessly use BizTalk services directly through the new Business Connectivity Services in SharePoint 2010. BizTalk Server 2010 includes a new data mapping and transformation tool to dramatically reduce the development time needed to mediate data exchange between disparate systems. It also provides a new single dashboard to manage performance parameters and streamline deployments from development to test to production. BizTalk 2010 includes a new, scalable Trading Partner Management (TPM) model with a graphical interface for flexible management of business partner relationships and an efficient on-boarding process.

    Read the article

  • Car Modelling for race game

    - by Mert Toka
    I am taking a Computer Graphics course this semester and we have a video game competition. I am making a racing game with simulated dynamics. Our professor told us that we don't have to do much modelling, but since we haven't started the gaming part and since I have free time, I want to model the car. My first question is: which software do you recommend for designing game components? I know Maya right now. Secondly, if I design the car or any other part, what should its polygon count be in order for the game to run smoothly? I can design pretty much everything, but I assume that it is hard to design low-poly models.

    Read the article

  • Guest blog: A Closer Look at Oracle Price Analytics by Will Hutchinson

    - by Takin Babaei
    Overview:  Price Analytics helps companies understand how much of each sale goes into discounts, special terms, and allowances. This visibility lets sales management see the panoply of discounts and start seeing whether each discount drives desired behavior. Price Analytics monitors parts of the quote-to-order process, tracking quotes, including the whole price waterfall, and seeing which result in orders. The “price waterfall” shows all discounts between list price and “pocket price”. Pocket price is the final price the vendor puts in its pocket after all discounts are taken. The value proposition: based on benchmarks from leading consultancies and from companies I have talked to that have studied the effects of discounting and started enforcing what many of them call “discount discipline”, they find they can increase the pocket price by 0.8-3%. Yes, in today’s zero or negative inflation environment, one can, through better monitoring of discounts, collect what amounts to a price rise of a few percent. We are not talking about selling more product, merely about collecting a higher pocket price without decreasing quantities sold. Higher prices fall straight to the bottom line. The best reference I have ever found for understanding this phenomenon comes from an article from the September-October 1992 issue of Harvard Business Review called “Managing Price, Gaining Profit” by Michael Marn and Robert Rosiello of McKinsey & Co. They describe the outsized impact price management has on bottom line performance compared to selling more product or cutting variable or fixed costs. Price Analytics manages what Marn and Rosiello call “transaction pricing”, namely the prices of a given transaction, as opposed to what is on the price list or pricing according to the value received. They make the point that if the vendor does not manage the price waterfall, customers will, to the vendor’s detriment. The article also discusses the finding that, in the companies studied, there was no correlation between discount levels and any indication of customer value. I urge you to read this article. What Price Analytics does: Price Analytics looks at quotes the company issues and tracks them until either the quote is accepted or rejected or it expires. There are prebuilt adapters for EBS and Siebel as well as a universal adapter. The target audience includes pricing analysts, product managers, sales managers, and VP’s of sales, marketing, finance, and sales operations. It tracks how effective discounts have been, the win rate on quotes, how well pricing policies have been followed, customer and product profitability, and customer performance against commitments. It has the concept of price waterfall, the deal lifecycle, and price segmentation built into the product. These help product and sales managers understand their pricing and its effectiveness in driving revenue and profit. They also help understand how terms are adhered to during negotiations. They also help people understand what segments exist and how well they are adhered to. To help your company increase its profits and revenues, I urge you to look at this product. If you have questions, please contact me. Will Hutchinson, Master Principal Sales Consultant – Analytics, Oracle Corp. Will Hutchinson has worked in business intelligence and data warehousing for over 25 years. He started building data warehouses in 1986 at Metaphor, advancing to running Metaphor UK’s sales consulting area. He also worked in A.T. 
Kearney’s business intelligence practice for over four years, running projects and providing training to new consultants in the IT practice. He also worked at Informatica and then Siebel, before coming to Oracle with the Siebel acquisition. He became Master Principal Sales Consultant in 2009. He has worked on developing ROI and TCO models for business intelligence for over ten years. Mr. Hutchinson has a BS degree in Chemical Engineering from Princeton University and an MBA in Finance from the University of Chicago.

    Read the article

  • Strategies for Indexing Custom Fields in RavenDB

    - by Adrian Thompson Phillips
    In the relational database world, if I were developing a CRM system and wanted to let the user add their own custom fields that are searchable, I could have tables that store the name of the new column, the data type, the value, and so on (which would be inefficient to index), or I could use the less elegant (but more searchable) solution that software like Dynamics and SharePoint uses, whereby I create a load of columns on my aggregate root called CustomInt1, CustomInt2, etc. (which looks dirty and limits how many custom fields a user can have, but has indexing advantages). But my question is this: in NoSQL databases, what would be the best way of achieving the same thing? My priority is searchability. So what would be the best way to store this data? If I used a predefined set of properties (i.e. CustomData1, CustomData2, etc.), because these are all stored as JSON (i.e. strings) in the database, does this make it simpler because I don't have to worry about data types?
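
    One option worth prototyping (a sketch only, not an answer from the thread): keep the custom fields as a string dictionary on the document and project them into a static index as dynamic fields, so each custom field becomes its own searchable index field with no CustomData1..N ceiling. The class names below are invented, and the CreateField/dynamic-field pattern should be checked against the documentation for the RavenDB client version in use.

    using System.Collections.Generic;
    using System.Linq;
    using Raven.Client.Indexes;   // RavenDB .NET client (assumed available)

    public class Contact
    {
        public string Id { get; set; }
        public string Name { get; set; }

        // Storing every custom value as a string keeps the document schema-free,
        // at the cost of typed range queries without extra conversion.
        public Dictionary<string, string> CustomFields { get; set; }
    }

    // Hypothetical index exposing each custom field under its own name.
    public class Contacts_ByCustomFields : AbstractIndexCreationTask<Contact>
    {
        public Contacts_ByCustomFields()
        {
            Map = contacts => from contact in contacts
                              select new
                              {
                                  contact.Name,
                                  _ = contact.CustomFields.Select(f => CreateField(f.Key, f.Value))
                              };
        }
    }

    Queries would then target the generated field names (for example, a Lucene-style query with WhereEquals("Industry", "Mining")), which is the searchability the question is after; whether that trade-off beats a fixed CustomData1..N set depends on how the fields are actually queried.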

    Read the article

  • SQL Server Date Comparison Functions

    - by HighAltitudeCoder
    A few months ago, I found myself working with a repetitive cursor that looped until the data had been manipulated enough times that it was finally correct.  The cursor was heavily dependent upon dates, every time requiring the earlier of two (or several) dates in one stored procedure, while requiring the later of two dates in another stored procedure. In short, what I needed was a function that would allow me to perform the following evaluation: WHERE MAX(Date1, Date2) < @SomeDate. The problem is, the MAX() function in SQL Server does not perform this functionality; it is an aggregate over rows, not a comparison of two scalar values.  So, I set out to put these functions together.  They are titled: EarlierOf() and LaterOf().

    /**********************************************************
                          EarlierOf.sql
    **********************************************************/
    /**********************************************************
      Return the earlier of two DATETIME variables.

      Parameter 1: DATETIME1
      Parameter 2: DATETIME2

      Works for a variety of DATETIME or NULL values.
      Even though comparisons with NULL are actually indeterminate,
      we know conceptually that NULL is not earlier or later than
      any other date provided.

      SYNTAX:
      SELECT dbo.EarlierOf('1/1/2000','12/1/2009')
      SELECT dbo.EarlierOf('2009-12-01 00:00:00.000','2009-12-01 00:00:00.521')
      SELECT dbo.EarlierOf('11/15/2000',NULL)
      SELECT dbo.EarlierOf(NULL,'1/15/2004')
      SELECT dbo.EarlierOf(NULL,NULL)
    **********************************************************/
    USE AdventureWorks
    GO

    IF EXISTS
          (SELECT *
          FROM sysobjects
          WHERE name = 'EarlierOf'
          AND xtype = 'FN'
          )
    BEGIN
          DROP FUNCTION EarlierOf
    END
    GO

    CREATE FUNCTION EarlierOf
    (
          @Date1      DATETIME,
          @Date2      DATETIME
    )
    RETURNS DATETIME
    AS
    BEGIN
          DECLARE @ReturnDate DATETIME

          IF (@Date1 IS NULL AND @Date2 IS NULL)
          BEGIN
                SET @ReturnDate = NULL
                GOTO EndOfFunction
          END

          ELSE IF (@Date1 IS NULL AND @Date2 IS NOT NULL)
          BEGIN
                SET @ReturnDate = @Date2
                GOTO EndOfFunction
          END

          ELSE IF (@Date1 IS NOT NULL AND @Date2 IS NULL)
          BEGIN
                SET @ReturnDate = @Date1
                GOTO EndOfFunction
          END

          ELSE
          BEGIN
                SET @ReturnDate = @Date1
                IF @Date2 < @Date1
                      SET @ReturnDate = @Date2
                GOTO EndOfFunction
          END

          EndOfFunction:
          RETURN @ReturnDate

    END -- End Function
    GO

    ---- Set Permissions
    --GRANT SELECT ON EarlierOf TO UserRole1
    --GRANT SELECT ON EarlierOf TO UserRole2
    --GO

    The inverse of this function is only slightly different.

    /**********************************************************
                          LaterOf.sql
    **********************************************************/
    /**********************************************************
      Return the later of two DATETIME variables.

      Parameter 1: DATETIME1
      Parameter 2: DATETIME2

      Works for a variety of DATETIME or NULL values.
      Even though comparisons with NULL are actually indeterminate,
      we know conceptually that NULL is not earlier or later than
      any other date provided.

      SYNTAX:
      SELECT dbo.LaterOf('1/1/2000','12/1/2009')
      SELECT dbo.LaterOf('2009-12-01 00:00:00.000','2009-12-01 00:00:00.521')
      SELECT dbo.LaterOf('11/15/2000',NULL)
      SELECT dbo.LaterOf(NULL,'1/15/2004')
      SELECT dbo.LaterOf(NULL,NULL)
    **********************************************************/
    USE AdventureWorks
    GO

    IF EXISTS
          (SELECT *
          FROM sysobjects
          WHERE name = 'LaterOf'
          AND xtype = 'FN'
          )
    BEGIN
          DROP FUNCTION LaterOf
    END
    GO

    CREATE FUNCTION LaterOf
    (
          @Date1      DATETIME,
          @Date2      DATETIME
    )
    RETURNS DATETIME
    AS
    BEGIN
          DECLARE @ReturnDate DATETIME

          IF (@Date1 IS NULL AND @Date2 IS NULL)
          BEGIN
                SET @ReturnDate = NULL
                GOTO EndOfFunction
          END

          ELSE IF (@Date1 IS NULL AND @Date2 IS NOT NULL)
          BEGIN
                SET @ReturnDate = @Date2
                GOTO EndOfFunction
          END

          ELSE IF (@Date1 IS NOT NULL AND @Date2 IS NULL)
          BEGIN
                SET @ReturnDate = @Date1
                GOTO EndOfFunction
          END

          ELSE
          BEGIN
                SET @ReturnDate = @Date1
                IF @Date2 > @Date1
                      SET @ReturnDate = @Date2
                GOTO EndOfFunction
          END

          EndOfFunction:
          RETURN @ReturnDate

    END -- End Function
    GO

    ---- Set Permissions
    --GRANT SELECT ON LaterOf TO UserRole1
    --GRANT SELECT ON LaterOf TO UserRole2
    --GO

    The interesting thing about this function is its simplicity and the built-in NULL handling functionality.  It's interesting because it seems like something should already exist in SQL Server that does this.  From a different vantage point, if you create this functionality and it is easy to use (ideally, intuitively self-explanatory), you have made a successful contribution. Interesting is good.  Self-explanatory, or intuitive, is FAR better.  Happy coding! Graeme

    Read the article

  • Convert old AVI files to a modern format

    - by iWerner
    Hi, we have a collection of old home videos that were saved in AVI format a long time ago. I want to convert these files to a more modern format because the Totem Movie Player that comes with Ubuntu 10.4 seems to be the only program capable of playing them. The files seem to be encoded with a MJPEG codec, and playing them in VLC or Windows Media Player plays only the sound but there is no video. Avidemux was able to open the files, but the quality of the video is severely degraded: The video skips frames and is interlaced (it's not interlaced when playing it in Totem). Neither ffmpeg nor mencoder seems to be able to read the video stream. mencoder reports that it is using ffmpeg's codec. Here's a section from its output: ========================================================================== Opening video decoder: [ffmpeg] FFmpeg's libavcodec codec family [mjpeg @ 0x92a7260]mjpeg: using external huffman table [mjpeg @ 0x92a7260]mjpeg: error using external huffman table, switching back to internal Unsupported PixelFormat -1 Selected video codec: [ffmjpeg] vfm: ffmpeg (FFmpeg MJPEG) while running ffmpeg produces the following: $ ffmpeg -i input.avi output.avi FFmpeg version SVN-r0.5.1-4:0.5.1-1ubuntu1, Copyright (c) 2000-2009 Fabrice Bellard, et al. configuration: --extra-version=4:0.5.1-1ubuntu1 --prefix=/usr --enable-avfilter --enable-avfilter-lavf --enable-vdpau --enable-bzlib --enable-libgsm --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-pthreads --enable-zlib --disable-stripping --disable-vhook --enable-runtime-cpudetect --enable-gpl --enable-postproc --enable-swscale --enable-x11grab --enable-libdc1394 --enable-shared --disable-static libavutil 49.15. 0 / 49.15. 0 libavcodec 52.20. 1 / 52.20. 1 libavformat 52.31. 0 / 52.31. 0 libavdevice 52. 1. 0 / 52. 1. 0 libavfilter 0. 4. 0 / 0. 4. 0 libswscale 0. 7. 1 / 0. 7. 1 libpostproc 51. 2. 0 / 51. 2. 0 built on Mar 4 2010 12:35:30, gcc: 4.4.3 [avi @ 0x87952c0]non-interleaved AVI Input #0, avi, from 'input.avi': Duration: 00:00:15.24, start: 0.000000, bitrate: 22447 kb/s Stream #0.0: Video: mjpeg, yuvj422p, 720x544, 25 tbr, 25 tbn, 25 tbc Stream #0.1: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s Output #0, avi, to 'output.avi': Stream #0.0: Video: mpeg4, yuv420p, 720x544, q=2-31, 200 kb/s, 90k tbn, 25 tbc Stream #0.1: Audio: mp2, 44100 Hz, stereo, s16, 64 kb/s Stream mapping: Stream #0.0 -> #0.0 Stream #0.1 -> #0.1 Press [q] to stop encoding frame= 0 fps= 0 q=0.0 Lsize= 143kB time=15.23 bitrate= 76.9kbits/s video:0kB audio:119kB global headers:0kB muxing overhead 20.101777% So the problem is that output does not contain any video, as evidenced by the video:0kB at the end. In all of the above cases the audio comes out fine. So my question is: What can I do to convert these files to a more modern format with more modern codecs?

    Read the article

  • Ask HTG: How Can I Check the Age of My Windows Installation?

    - by Jason Fitzpatrick
    Curious about when you installed Windows and how long you’ve been chugging along without a system refresh? Read on as we show you a simple way to see how long-in-the-tooth your Windows installation is. Dear How-To Geek, It feels like it has been forever since I installed Windows 7 and I’m starting to wonder if some of the performance issues I’m experiencing have something to do with how long ago it was installed. It isn’t crashing or anything horrible, mind you, it just feels slower than it used to and I’m wondering if I should reinstall it to wipe the slate clean. Is there a simple way to determine the original installation date of Windows on its host machine? Sincerely, Worried in Windows Although you only intended to ask one question, you actually asked two. Your direct question is an easy one to answer (how to check the Windows installation date). The indirect question is, however, a little trickier (if you need to reinstall Windows to get a performance boost). Let’s start off with the easy one: how to check your installation date. Windows includes a handy little application just for the purposes of pulling up system information like the installation date, among other things. Open the Start Menu and type cmd in the run box (or, alternatively, press WinKey+R to pull up the run dialog and enter the same command). At the command prompt, type systeminfo.exe Give the application a moment to run; it takes around 15-20 seconds to gather all the data. You’ll most likely need to scroll back up in the console window to find the section at the top that lists operating system stats. What you care about is Original Install Date: We’ve been running the machine we tested the command on since August 23 2009. For the curious, that’s one month and a day after the initial public release of Windows 7 (after we were done playing with early test releases and spent a month mucking around in the guts of Windows 7 to report on features and flaws, we ran a new clean installation and kept on trucking). Now, you might be asking yourself: Why haven’t they reinstalled Windows in all that time? Haven’t things slowed down? Haven’t they upgraded hardware? The truth of the matter is, in most cases there’s no need to completely wipe your computer and start from scratch to resolve issues with Windows and, if you don’t bog your system down with unnecessary and poorly written software, things keep humming along. In fact, we even migrated this machine from a traditional mechanical hard drive to a newer solid-state drive back in 2011. Even though we’ve tested piles of software since then, the machine is still rather clean because 99% of that testing happened in a virtual machine. That’s not just a trick for technology bloggers, either, virtualizing is a handy trick for anyone who wants to run a rock solid base OS and avoid the bog-down-and-then-refresh cycle that can plague a heavily used machine. So while it might be the case that you’ve been running Windows 7 for years and heavy software installation and use has bogged your system down to the point a refresh is in order, we’d strongly suggest reading over the following How-To Geek guides to see if you can’t wrangle the machine into shape without a total wipe (and, if you can’t, at least you’ll be in a better position to keep the refreshed machine light and zippy): HTG Explains: Do You Really Need to Regularly Reinstall Windows? 
PC Cleaning Apps are a Scam: Here’s Why (and How to Speed Up Your PC) The Best Tips for Speeding Up Your Windows PC Beginner Geek: How to Reinstall Windows on Your Computer Everything You Need to Know About Refreshing and Resetting Your Windows 8 PC Armed with a little knowledge, you too can keep a computer humming along until the next iteration of Windows comes along (and beyond) without the hassle of reinstalling Windows and all your apps.         
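
    As a follow-up to the systeminfo.exe tip above: if you would rather pull the same figure programmatically than scroll through console output, the registry also records the original install date. The snippet below is a hedged sketch, not something from the article; it assumes the InstallDate value under HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion holds seconds since 1 January 1970 (UTC), which is the conventional encoding on Windows 7-era systems, and it is worth cross-checking the result against systeminfo.exe.

    using System;
    using Microsoft.Win32;

    class ShowInstallDate
    {
        static void Main()
        {
            // Assumption: "InstallDate" is a DWORD holding seconds since 1970-01-01 UTC.
            using (var key = Registry.LocalMachine.OpenSubKey(
                @"SOFTWARE\Microsoft\Windows NT\CurrentVersion"))
            {
                long seconds = Convert.ToInt64(key.GetValue("InstallDate"));
                DateTime installedUtc = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)
                    .AddSeconds(seconds);

                Console.WriteLine("Original install date (local time): {0}",
                    installedUtc.ToLocalTime());
            }
        }
    }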

    Read the article

  • Stepping outside Visual Studio IDE [Part 2 of 2] with Mono 2.6.4

    - by mbcrump
    Continuing part 2 of my Stepping outside the Visual Studio IDE series is the open-source Mono Project. Mono is a software platform designed to allow developers to easily create cross platform applications. Sponsored by Novell (http://www.novell.com/), Mono is an open source implementation of Microsoft's .NET Framework based on the ECMA standards for C# and the Common Language Runtime. A growing family of solutions and an active and enthusiastic contributing community is helping position Mono to become the leading choice for development of Linux applications. So, to clarify: you can use Mono to develop .NET applications that will run on Linux, Windows or Mac. It's basically an IDE that has roots in Linux. Let's first look at the compatibility: Compatibility If you already have an application written in .NET, you can scan your application with the Mono Migration Analyzer (MoMA) to determine if your application uses anything not supported by Mono. The current release version of Mono is 2.6 (released December 2009). The easiest way to describe what Mono currently supports is: everything in .NET 3.5 except WPF and WF, with limited WCF. Here is a slightly more detailed view, by .NET framework version: Implemented C# 3.0 System.Core LINQ ASP.Net 3.5 ASP.Net MVC C# 2.0 (generics) Core Libraries 2.0: mscorlib, System, System.Xml ASP.Net 2.0 - except WebParts ADO.Net 2.0 Winforms/System.Drawing 2.0 - does not support right-to-left C# 1.0 Core Libraries 1.1: mscorlib, System, System.Xml ASP.Net 1.1 ADO.Net 1.1 Winforms/System.Drawing 1.1 Partially Implemented LINQ to SQL - Mostly done, but a few features missing WCF - Silverlight 2.0 subset completed Not Implemented WPF - no plans to implement WF - Will implement WF 4 instead in future versions of Mono. System.Management - does not map to Linux System.EnterpriseServices - deprecated Links to documentation. The Official Mono FAQ's Links to binaries. Mono IDE Latest Version is 2.6.4 That's it, nothing more is required except to compile and run .NET code on Linux. Installation After landing on the Mono project home page, you can select which platform you want to download. I typically pick the Virtual PC image since I spend all of my day using Windows 7. Go ahead and pick whatever version is best for you. The Virtual PC image comes with Suse Linux. Once the image is launched, you will see the following: I'm not going to go through each option but it's best to start with the "Start Here" icon. It will provide you with information on new projects or existing VS projects. After you get Mono installed, it's probably a good idea to run a quick Hello World program to make sure everything is set up properly. This lets you know that your Mono installation is working before you try writing or running a more complex application. To write a "Hello World" program follow these steps: Start Mono Development Environment. Create a new Project: File->New->Solution Select "Console Project" in the category list. Enter a project name into the Project name field, for example, "HW Project". Click "Forward" Click “Packaging” then OK. You should have a screen very similar to a VS Console App. Click the "Run" button in the toolbar (Ctrl-F5). Look in the Application Output and you should see “Hello World!” Your screen should look like the screen below. That should do it for a simple console app in Mono. 
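
    For reference, the console project that MonoDevelop generates is nothing more than a standard C# entry point, roughly along these lines (the exact template text and namespace may differ slightly from what the "HW Project" template produces):

    using System;

    namespace HWProject
    {
        class MainClass
        {
            public static void Main(string[] args)
            {
                // Printed to the Application Output pad when you press Ctrl-F5.
                Console.WriteLine("Hello World!");
            }
        }
    }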
    To test out an ASP.NET application, simply copy your code to a new directory in /srv/www/htdocs, then visit the following URL: http://localhost/directoryname/page.aspx, where directoryname is the directory where you deployed your application and page.aspx is the initial page for your software. Databases: You can continue to use a SQL Server database, or use MySQL, Postgres, Sybase, Oracle, IBM's DB2, or SQLite. Conclusion: I hope this brief look at the Mono IDE helps someone get acquainted with development outside of VS. As always, I welcome any suggestions or comments.

    Read the article

  • Microsoft accuses a former director of taking confidential data to Salesforce

    Microsoft accuses a former director of taking confidential data to Salesforce.com, and worries about Dynamics CRM Online 2011. Microsoft accuses one of its former directors, now working at Salesforce.com, of having taken confidential company data with him. Matt Miszewski, former director of Microsoft's major-accounts division, left Microsoft in December to join Salesforce.com a few weeks later, where he was to take up the post of vice president for the public sector. But Microsoft believes he holds valuable data. The company had obtained a temporary injunction to block the hir...

    Read the article

  • Moving from Silverlight 4 Beta to RC - Part 1

    The other day I had finished up my Task-It Webinar, written a few blog posts, and knew the time had come to move from my Silverlight 4 Beta environment up to the latest RC (release candidate) bits that were released last Monday. What disappointed me when I went to the Silverlight 4 Information Page is that it told me what to install, but not what to uninstall first. Uninstalling: I'm not entirely sure if I had to uninstall anything, or if installing the new stuff would just work, but in poking around the web I found posts stating that you must uninstall the following items first. Unfortunately I'm going by memory here and have not been able to find my way back to the magic page I found in the myriad of posts that I went through: Microsoft Visual Studio Beta 2; Microsoft .NET Framework 4 Extended (apparently this must be done *before* the next one); Microsoft .NET Framework Client Profile; Microsoft Silverlight 4 Tools for Visual Studio 2010; and WCF RIA Services Preview for Visual Studio 2010. While I was at it, I removed a bunch of other stuff, like Blend 3, Blend 4, the SDKs associated with them, and a bunch of other stuff that was old. Of course, I didn't really want/need to keep any Silverlight 3 stuff around as I am developing Task-It in Silverlight 4. If I need a Silverlight 3 environment at some point I'll set it up in a virtual environment. NOTE: One thing that I did not uninstall is the Microsoft Silverlight 4 Toolkit November 2009. The reason is that they have not released the March version yet, so if you uninstall this, you'll end up having to reinstall it. Installing: OK, now that I had all of that old stuff off my machine, it was time to get the new stuff. For this part I liked Tim Heuer's post, A Guide to What has Changed in Silverlight 4 RC, better than the Silverlight 4 Information Page. Visual Studio 2010 RC - I installed Ultimate, but you may not need that version. No harm, it's free for now anyway. I downloaded the .exe and the 3 .rar files, then ran the .exe. I then extracted the contents of the ISO (using WinZip) to a new directory, and now had Setup.exe, a bunch of .cab files and some other assorted stuff. I simply ran Setup.exe and chose custom install (only because I wanted to uncheck Visual C++...I don't really have a need for that). Silverlight 4 Tools for Visual Studio 2010 - As mentioned in Tim's blog post, this installs the Silverlight developer runtime, SDK, tools, and WCF RIA Services. WCF RIA Services Toolkit March 2010 - I'm not sure if/when I'll need any of this stuff, but no harm in installing it anyway. Expression Blend 4 beta - Only if you plan to use Blend, which I do. Windows Phone Developer Tools - Only if you are interested in playing with Windows Phone 7 development. Wrap Up: Hopefully I got those steps right. If anyone finds anything I've missed, please just add a comment to this post and I'll update it accordingly.

    Read the article

  • Get to Know a Candidate (3 of 25): Virgil Goode&ndash;Constitution Party

    - by Brian Lanham
    DISCLAIMER: This is not a post about “Romney” or “Obama”. This is not a post for whom I am voting. Information sourced from Wikipedia. Meet Virgil Goode of the Constitution Party. Goode served as a Republican member of the United States House of Representatives from 1997 to 2009. He represented the 5th congressional district of Virginia. Goode was born in Richmond, Virginia, the son of Alice Clara (née Besecker) and Virgil Hamlin Goode. He has spent most of his life in Rocky Mount. Goode graduated with a B.A. from the University of Richmond (Phi Beta Kappa) and with a J.D. from the University of Virginia School of Law. He also is a member of Lambda Chi Alpha Fraternity and served in the Army National Guard from 1969 to 1975. Goode grew up as a Democrat. He entered politics soon after graduating from law school. At the age of 27, he won a special election to the state Senate from a Southside district as an independent after the death of the Democratic incumbent. One of his major campaign focuses at the time was advocacy for the Equal Rights Amendment. Soon after being elected, he joined the Democrats. Goode wore his party ties very loosely. He became famous for his support of the tobacco industry, expressing his fear that "his elderly mother would be denied 'the one last pleasure' of smoking a cigarette on her hospital deathbed." He was an ardent defender of gun rights while being an enthusiastic supporter of L. Douglas Wilder, who later became the first elected black governor in the history of the United States. At the Democratic Party's state political convention in 1985, Goode nominated Wilder for lieutenant governor. However, while governor, Wilder cracked down on the sale of guns in the state. After the 1995 elections resulted in a 20–20 split between Democrats and Republicans in the State Senate, Goode seriously considered voting with the Republicans on organizing the chamber. Had he done so, the State Senate would have been under Republican control for the first time since Reconstruction (the Republicans ultimately won control outright in 1999). Goode's actions at the time "forced his party to share power with Republican lawmakers in the state legislature," which further upset the Democratic Party. Goode is on the ballot in CA, FL, ID, IO, LA, MI, MN, MS, MI, NJ, NM, NY, NV, ND, OH, SC, SD, TN, UT, VA, WA, WI, WY.  He is a write-in candidate in CA, CT, DC, GA, IL, IN, ME, MD, MA, MO, NC, TX, VT, WV. Constitution Party: This party was founded as the “U.S. Taxpayers’ Party” and considers itself conservative. The party's platform is predicated on the principles of the nation's founding documents. The party puts a large focus on immigration, calling for stricter penalties towards illegal immigrants and a moratorium on legal immigration until all federal subsidies to immigrants are discontinued. The party absorbed the American Independent Party, originally founded for George Wallace's 1968 presidential campaign. The American Independent Party of California has been an affiliate of the Constitution Party since its founding; however, current party leadership is disputed and the issue is in court to resolve this conflict. The Constitution Party has some substantial support from the Christian Right and in 2010 achieved major party status in Colorado. Learn more about Virgil Goode and the Constitution Party on Wikipedia.

    Read the article

  • First Windows Phone 7 &ndash; Mobile LOB App.

    - by Richard Jones
    So I spent a couple of hours yesterday building my first Windows 7 Phone Series application. I still can't get used to saying Windows 7 Phone Series, so I think I'll just go with WP7. I must say I'm really impressed. Calling web services is a breeze, and laying out the user interface is very straightforward. I had made calls into Dynamics NAV using web services in under 10 minutes. Working in XAML takes a bit of getting used to; I'm not trying to do anything too clever yet. One thing I will point out: to transition from one XAML page to the next, use this.NavigationService.Navigate(new Uri("/Primary.xaml", UriKind.Relative)); Coming from the Compact Framework, this is the equivalent of a Window.Show. It's so nice to be able to talk nicely again about Windows-based mobile development!
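
    As a rough illustration of the "calling web services is a breeze" point: with a proxy generated via Add Service Reference, Silverlight/WP7 uses the event-based asynchronous pattern. This is a hedged sketch only; every type and member name below (NavServiceClient, GetCustomersAsync, CustomerList, LoadCustomers_Click) is hypothetical, and the real names come from whatever Dynamics NAV page you expose as a web service.

    using System.Windows;
    using Microsoft.Phone.Controls;

    public partial class Primary : PhoneApplicationPage
    {
        public Primary()
        {
            InitializeComponent();
        }

        private void LoadCustomers_Click(object sender, RoutedEventArgs e)
        {
            // Hypothetical proxy generated by "Add Service Reference"
            // against a Dynamics NAV web-service page.
            var client = new NavServiceClient();

            client.GetCustomersCompleted += (s, args) =>
            {
                if (args.Error == null)
                {
                    // The completed event is raised on the UI thread in Silverlight,
                    // so it is safe to bind the result to a control here.
                    CustomerList.ItemsSource = args.Result;
                }
            };

            client.GetCustomersAsync();   // asynchronous call; the UI stays responsive
        }
    }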

    Read the article

  • 2D Car Simulation with Throttle Linear Physics

    - by James
    I'm trying to make a simulation game for an automatic cruise control system. The system simulates a car on varying inclinations and throttle speeds. I've coded up to the car physics but these do not make sense. The dynamics of the simulation are specified as follows: a = V' - V; T = (k1)V + θ(k2) + ma; V' = (1 - (k1 / m) V) + T - (k2 / m) * θ; where T = throttle position, k1 = viscous friction, V = speed, V' = next speed, θ = angle of incline, k2 = m g sin θ, a = acceleration, m = mass. Notice that the angle of incline in the equation is not chopped up by sin or cos. Even the equation for acceleration isn't right. Can anyone correct them or am I misinterpreting the physics?
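
    For comparison, here is one common way to write a discrete longitudinal model; this is a hedged sketch and not the assignment's specified dynamics: treat the throttle as a driving force, subtract viscous friction k1*V and the gravity component m*g*sin(θ), and integrate over a timestep. The symbol names mirror the question, while dt and the constants are assumptions.

    using System;

    class CruiseControlSim
    {
        const double m  = 1200.0;   // vehicle mass in kg (assumed)
        const double k1 = 50.0;     // viscous friction coefficient (assumed)
        const double g  = 9.81;     // gravity, m/s^2

        // One integration step: returns the new speed after dt seconds.
        // T     - throttle expressed as a driving force in newtons (assumed)
        // V     - current speed in m/s
        // theta - incline angle in radians
        static double Step(double T, double V, double theta, double dt)
        {
            // Net force = throttle force - viscous friction - gravity along the slope.
            double a = (T - k1 * V - m * g * Math.Sin(theta)) / m;
            return V + a * dt;
        }

        static void Main()
        {
            double v = 0.0;
            for (int i = 0; i < 10; i++)
            {
                v = Step(2000.0, v, 0.05, 0.1);   // constant throttle on a gentle incline
                Console.WriteLine("t={0,4:F1}s  v={1:F2} m/s", (i + 1) * 0.1, v);
            }
        }
    }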

    Read the article

  • Java EE 7 Roadmap

    - by Linda DeMichiel
    The Java EE 6 Platform, released in December 2009, has seen great uptake from the community with its POJO-based programming model, lightweight Web Profile, and extension points. There are now 13 Java EE 6 compliant appserver implementations today! When we announced the Java EE 7 JSR back in early 2011, our plans were that we would release it by Q4 2012. This target date was slightly over three years after the release of Java EE 6, but at the same time it meant that we had less than two years to complete a fairly comprehensive agenda — to continue to invest in significant enhancements in simplification, usability, and functionality in updated versions of the JSRs that are currently part of the platform; to introduce new JSRs that reflect emerging needs in the community; and to add support for use in cloud environments. We have since announced a minor adjustment in our dates (to the spring of 2013) in order to accommodate the inclusion of JSRs of importance to the community, such as Web Sockets and JSON-P. At this point, however, we have to make a choice. Despite our best intentions, our progress has been slow on the cloud side of our agenda. Partially this has been due to a lack of maturity in the space for provisioning, multi-tenancy, elasticity, and the deployment of applications in the cloud. And partially it is due to our conservative approach in trying to get things "right" in view of limited industry experience in the cloud area when we started this work. Because of this, we believe that providing solid support for standardized PaaS-based programming and multi-tenancy would delay the release of Java EE 7 until the spring of 2014 — that is, two years from now and over a year behind schedule. In our opinion, that is way too long. We have therefore proposed to the Java EE 7 Expert Group that we adjust our course of action — namely, stick to our current target release dates, and defer the remaining aspects of our agenda for PaaS enablement and multi-tenancy support to Java EE 8. Of course, we continue to believe that Java EE is well-suited for use in the cloud, although such use might not be quite ready for full standardization. Even today, without Java EE 7, Java EE vendors such as Oracle, Red Hat, IBM, and CloudBees have begun to offer the ability to run Java EE applications in the cloud. Deferring the remaining cloud-oriented aspects of our agenda has several important advantages: It allows Java EE Platform vendors to gain more experience with their implementations in this area and thus helps us avoid risks entailed by trying to standardize prematurely in an emerging area. It means that the community won't need to wait longer for those features that are ready at the cost of those features that need more time. Because we have already laid some of the infrastructure for cloud support in Java EE 7, including resource definition metadata, improved security configuration, JPA schema generation, etc., it will allow us to expedite a Java EE 8 release. We therefore plan to target the Java EE 8 Platform release for the spring of 2015. This shift in the scope of Java EE 7 allows us to better retain our focus on enhancements in simplification and usability and to deliver on schedule those features that have been most requested by developers. 
These include the support for HTML 5 in the form of Web Sockets and JSON-P; the simplified JMS 2.0 APIs; improved Managed Bean alignment, including transactional interceptors; the JAX-RS 2.0 client API; support for method-level validation; a much more comprehensive expression language; and more. We feel strongly that this is the right thing to do, and we hope that you will support us in this proposed direction.

    Read the article

  • How to make Ubuntu recognize an unknown external display (so I can adjust its resolution)?

    - by WagnerAA
    I have a Dell laptop with an external monitor attached (a Samsumg SyncMaster 931c). My laptop display was recognized, and I can adjust its optimum resolution. My external display is still unknown, thus I'm stuck at a lower resolution (1024x768): I tried the "Detect Displays" button, but it didn't work, nothing happens. I recently upgraded from Ubuntu 12.04 to 12.10. Things were working before. I don't know if I can actually change this configuration, or if this is a bug. I searched for an answer here and also in Launchpad's website, but found none. I even tried to install Nvidia drivers, and just messed things up. It seems I wasn't even using nvidia before, as I guessed by looking at my additional drivers configuration: My laptop has an Intel chipset, I guess: $ dpkg --get-selections | grep -i -e nvidia -e intel intel-gpu-tools install libdrm-intel1:amd64 install libdrm-intel1:i386 install nvidia-common install xserver-xorg-video-intel install I don't have an xorg.conf file (I think this is nvidia related, am I right?): $ cat /etc/X11/xorg.conf cat: /etc/X11/xorg.conf: No such file or directory $ ls -l /etc/X11/ total 76 drwxr-xr-x 2 root root 4096 Out 19 23:41 app-defaults drwxr-xr-x 2 root root 4096 Abr 25 2012 cursors -rw-r--r-- 1 root root 18 Abr 25 2012 default-display-manager drwxr-xr-x 4 root root 4096 Abr 25 2012 fonts -rw-r--r-- 1 root root 17394 Dez 3 2009 rgb.txt lrwxrwxrwx 1 root root 13 Mai 1 03:33 X -> /usr/bin/Xorg drwxr-xr-x 3 root root 4096 Out 19 23:41 xinit drwxr-xr-x 2 root root 4096 Jan 23 2012 xkb -rw-r--r-- 1 root root 0 Out 24 08:55 xorg.conf.nvidia-xconfig-original -rwxr-xr-x 1 root root 709 Abr 1 2010 Xreset drwxr-xr-x 2 root root 4096 Out 19 10:08 Xreset.d drwxr-xr-x 2 root root 4096 Out 19 10:08 Xresources -rwxr-xr-x 1 root root 3730 Jan 20 2012 Xsession drwxr-xr-x 2 root root 4096 Out 20 00:11 Xsession.d -rw-r--r-- 1 root root 265 Jul 1 2008 Xsession.options -rw-r--r-- 1 root root 13 Ago 15 06:43 XvMCConfig -rw-r--r-- 1 root root 601 Abr 25 2012 Xwrapper.config Here is some information I gathered by looking at other related posts: $ sudo lshw -C display; lsb_release -a; uname -a *-display:0 description: VGA compatible controller product: Mobile 4 Series Chipset Integrated Graphics Controller vendor: Intel Corporation physical id: 2 bus info: pci@0000:00:02.0 version: 07 width: 64 bits clock: 33MHz capabilities: msi pm vga_controller bus_master cap_list rom configuration: driver=i915 latency=0 resources: irq:48 memory:f6800000-f6bfffff memory:d0000000-dfffffff ioport:1800(size=8) *-display:1 UNCLAIMED description: Display controller product: Mobile 4 Series Chipset Integrated Graphics Controller vendor: Intel Corporation physical id: 2.1 bus info: pci@0000:00:02.1 version: 07 width: 64 bits clock: 33MHz capabilities: pm bus_master cap_list configuration: latency=0 resources: memory:f6100000-f61fffff LSB Version: 
core-2.0-amd64:core-2.0-noarch:core-3.0-amd64:core-3.0-noarch:core-3.1-amd64:core-3.1-noarch:core-3.2-amd64:core-3.2-noarch:core-4.0-amd64:core-4.0-noarch:cxx-3.0-amd64:cxx-3.0-noarch:cxx-3.1-amd64:cxx-3.1-noarch:cxx-3.2-amd64:cxx-3.2-noarch:cxx-4.0-amd64:cxx-4.0-noarch:desktop-3.1-amd64:desktop-3.1-noarch:desktop-3.2-amd64:desktop-3.2-noarch:desktop-4.0-amd64:desktop-4.0-noarch:graphics-2.0-amd64:graphics-2.0-noarch:graphics-3.0-amd64:graphics-3.0-noarch:graphics-3.1-amd64:graphics-3.1-noarch:graphics-3.2-amd64:graphics-3.2-noarch:graphics-4.0-amd64:graphics-4.0-noarch:printing-3.2-amd64:printing-3.2-noarch:printing-4.0-amd64:printing-4.0-noarch:qt4-3.1-amd64:qt4-3.1-noarch Distributor ID: Ubuntu Description: Ubuntu 12.10 Release: 12.10 Codename: quantal Linux Batcave 3.5.0-17-generic #28-Ubuntu SMP Tue Oct 9 19:31:23 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux $ xrandr -q Screen 0: minimum 320 x 200, current 2304 x 800, maximum 32767 x 32767 LVDS1 connected 1280x800+0+0 (normal left inverted right x axis y axis) 286mm x 1790mm 1280x800 59.9*+ 1024x768 60.0 800x600 60.3 56.2 640x480 59.9 VGA1 connected 1024x768+1280+32 (normal left inverted right x axis y axis) 0mm x 0mm 1024x768 60.0* 800x600 60.3 56.2 848x480 60.0 640x480 59.9 DP1 disconnected (normal left inverted right x axis y axis) If there's anything else I can do, any other information I can post here, to help me configure this external display, please let me know. If this is actually a bug, I apologize (I know bugs are not allowed here), but I really wasn't sure. And I will promptly file a bug report in Launchpad if that's the case. Thanks a lot in advance. ;)

    Read the article

  • Merging Social Accounts: What We Learned This Weekend

    - by Mike Stiles
    Guest Post by Erika BrookesWe learned that it’s not always as easy as you think it’s going to be. While it’s widely accepted that merging multiple owned Facebook Pages that are duplicating communities and putting out the same type of content is a best practice, actually pulling it off without rattling fans is a trickier proposition. Facebook is nice and clear about how to merge Facebook Pages. Although content is not carried over, Likes from the pages you’re merging are. So you can imagine the surprise when such fans start seeing posts in their News Feed from a page they don’t believe they ever Liked. One community member accurately likened it to having your bank come under another bank’s brand name. The Facebook Page changes to the new brand, just like your debit card, emails, signs and other communication. This weekend we did our merge. The Facebook communities of Vitrue, Involver and Collective Intellect were pulled into one community, Oracle Social. Could we have handled it better? Oh yeah. Our intent was to make sure, to the fullest extent possible, that the fans of the Vitrue, Involver, and Collective Intellect brand pages were well-informed about the pending page merges in ADVANCE of the merge. While many were aware that Oracle acquired the three companies, many were not. We learned from fan feedback that we should have sent notifications MUCH earlier to make the brand Page merge crystal clear and to answer any questions. That was our bad, our responsibility and we apologize for Oracle Social showing up in your News Feed if you were not aware that it was a result of your fandom of Vitrue, Involver or Collective Intellect. It was our job to make you aware well in advance. Some felt they had never Liked the fan Pages of Vitrue, Involver or Collective Intellect, so they were understandably upset (some cultures may call it “fit to be tied”) when they found themselves fans of Oracle Social. One thing to consider is that since 2009, brands and developers have used and enjoyed free Involver tab apps like Twitter, RSS and YouTube (1.2 million of which are currently active), which included an opt-in Liking the Involver Page. Often, when Liking happens in a manner outside of the traditional clicking of a Like button on a brand Page, it’s easy to forget a Page was indeed Liked. Lastly, a few felt that their Like of the Page had been “bought.” It was not. No fans or Likes were separately purchased. Yes, the companies and the social properties of Vitrue, Involver and Collective Intellect were acquired by Oracle. Those brands are now being coordinated into the larger Oracle brand. In social media, that means those brands are being integrated into the Oracle Social community. So what now? We apologize and apply lessons learned. We learned that you not only have to communicate thoroughly and clearly, but you have to communicate well in advance of any actionable items that will affect fans. We’re more than willing to walk straight to the woodshed when we deserve it. Going forward, the social team here is dedicated to facilitating content, discussion and sharing around social for marketers, agencies, IT stakeholders and social staffs, including community managers. We anticipate Oracle Social being the premier gathering place for true social innovators as we move into social’s exciting next phase of development. Inevitably, some will still feel they are fans of the Page in error. While we hate to see you go, you may unlike the Page if it’s not relevant or useful to you. 
Let’s continue to contribute, participate, foster our desire to learn, and move forward together positively and constructively - both for current fans of the community and the many fans to come.

    Read the article

  • Starting over and new to Ubuntu

    - by 2funnyyone
    We have been having repeated problems with our interent service and using windows xp & sp3 (users and premissions) I see no need for them. I started with computers long before windows. Every since sp 3 come out in 2009 I have had nothing but problems. I have lost so many computers to virius and trojans, we just stack them up. We are with Qwest/ Century link which is using advertising servers which I think is causing the problem. All the computers are networked together which is not how I set them up. I beleive Century link is networking them through assignment of a domain for our home. This causes all the computers to crash twice. This is getting expensive. We tried buying new harddrives but reinfect with hours of connecting to internet. I also beleive the modem, router and all computers are infected. I put combofix on this one and that is the only reason we are still online with this laptop. I am afraid to install new equipment because my partner and I are on SSDI and this cost a lot. I go to school at UOP and had to run off a flash and reboot this laptop to recovery every other day or so, this pass month. New plan is: We are getting ready to install new equipment but afraid to reinfect again. Need help to install new equipment. The plan is to use current internet services from Qwest/ now Century Link. The list of New equipment in order: Century link wireless modem is ZyXEL PK5000Z with 4 direct connect Ethernet ports Next Dell Optiplex 210L ( used auction purchase ) 2 gb ram 80 g hard drive Ubuntu 11.10 operating system Next Wireless D-Link router WBR-1310 with 4 direct connect Ethernet ports OK-------- Purchased Dell OEM disk for Repair or Reinstalling Windows XP Professional Operating system (2 roommates as well) All infected computers are Dell desktops or laptops with XP Pro Also purchasing Ubuntu 12.04 for 3 computers. We like the way it runs but still learning it. Questions 1] How do we fdisk the infected computers without infecting new system. We have Dos disks, but none have floppy dish drive. We do have a new floppy disk drive and usb adapter we purchased from Amazon. 2] We are thinking Avast internet security because of the boot scan. We want all software loaded before reconnecting. We can manually load our internet provider information. We purchased StopZilla $100 for 5 computers, but not sure that is what we need. But need how to setup ports security and services we will need. Really lost at this part. So we are safe when we go back on the internet. 3] Want to connect reloaded fdisk systems to router as public connection and no sharing. Do not want to network all computers. 4] Want parental/ ownership control from Ubuntu system for internet connection (Children and friends). Do we restrict at the modem and/ or router? Any help would be a blessing. I do not want to go alone on this anymore.

    Read the article

  • What are the common character animation techniques used in tile based hack&slash games?

    - by Gorky
    I wonder what kind of animation techniques are used for creature and character animation in modern hack&slash-type tile-based games. Keyframing for different actions may be one option. Skeletal animation may be another. But how about the physics? Or do they use a totally hybrid system of inverse kinematics supported by a skeleton and physics, mixed with interpolated keyframing for more realistic animations? If so, how and for what reasons? I can think of many different solutions for the issues below, but I wonder what's used and best suited for issues like: walking or moving on uneven terrain; combat interaction, combat physics and collisions; attaching rigid items to a character and their interactions in the physics world; soft-body dynamics like hair, vegetation, clothes and fabric, in line with animations and interactions.
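
    As a concrete illustration of the "interpolated keyframing" option mentioned in the question, here is a minimal sketch (types and values invented, not from any particular engine) of linearly sampling a single joint angle from a keyframe track; real systems do the same per bone, usually with quaternions and easing curves instead of a single float.

    struct Keyframe
    {
        public float Time;    // seconds
        public float Angle;   // joint angle in radians
    }

    static class KeyframeSampler
    {
        // Linearly interpolate the joint angle between the two keyframes that bracket 't'.
        // Assumes the track is sorted by Time and has at least one entry.
        public static float Sample(Keyframe[] track, float t)
        {
            if (t <= track[0].Time) return track[0].Angle;
            for (int i = 1; i < track.Length; i++)
            {
                if (t <= track[i].Time)
                {
                    float u = (t - track[i - 1].Time) / (track[i].Time - track[i - 1].Time);
                    return track[i - 1].Angle + u * (track[i].Angle - track[i - 1].Angle);
                }
            }
            return track[track.Length - 1].Angle;
        }
    }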

    Read the article

  • lvm disappeared after disc replacement on raid10

    - by user142295
    here my problem: I am running ubuntu 12.04 on a raid10 (4 disks), on top of which I installed an lvm with two volume groups (one for /, one for /home). The layout of the disks are as follows: Disk /dev/sda: 1500.3 GB, 1500301910016 bytes 255 heads, 63 sectors/track, 182401 cylinders, total 2930277168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x0003f3b6 Device Boot Start End Blocks Id System /dev/sda1 * 63 481949 240943+ 83 Linux /dev/sda2 481950 2910640634 1455079342+ fd Linux raid autodetect /dev/sda3 2910640635 2930272064 9815715 82 Linux swap / Solaris Disk /dev/sdb: 1500.3 GB, 1500301910016 bytes 255 heads, 63 sectors/track, 182401 cylinders, total 2930277168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x00069785 Device Boot Start End Blocks Id System /dev/sdb1 63 2910158684 1455079311 fd Linux raid autodetect /dev/sdb2 2910158685 2930272064 10056690 82 Linux swap / Solaris Disk /dev/sdc: 1500.3 GB, 1500301910016 bytes 255 heads, 63 sectors/track, 182401 cylinders, total 2930277168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x00000000 Device Boot Start End Blocks Id System /dev/sdc1 63 2910158684 1455079311 fd Linux raid autodetect /dev/sdc2 2910158685 2930272064 10056690 82 Linux swap / Solaris Disk /dev/sdd: 1500.3 GB, 1500301910016 bytes 255 heads, 63 sectors/track, 182401 cylinders, total 2930277168 sectors Units = sectors of 1 * 512 = 512 bytes Sector size (logical/physical): 512 bytes / 512 bytes I/O size (minimum/optimal): 512 bytes / 512 bytes Disk identifier: 0x000f14de Device Boot Start End Blocks Id System /dev/sdd1 63 2910158684 1455079311 fd Linux raid autodetect /dev/sdd2 2910158685 2930272064 10056690 82 Linux swap / Solaris The first disk (/dev/sda) contains the /boot partition on /dev/sda1. I use grub2 to boot the system off this partition. On top of this raid10 I installed two volume groups, one for /, one for /home. This system worked well, I even exchanged two disks during the last two years. It always worked. But not this time. For the first time, /dev/sda broke. I do not know if this is an issue – I know I would have struggled anyways to overcome the problem with /boot installed on that disk and grub2 installed on the mbr of /dev/sda. 
Anyways, I did what I always did: start knoppix fire up the raid sudo mdadm --examine -scan which returns ARRAY /dev/md127 UUID=0dbf4558:1a943464:132783e8:19cdff95 start it up sudo mdadm --assemble /dev/md127 fail the failing disk (smart event) sudo mdadm /dev/md127 --fail /dev/sda2 remove the failing disk sudo mdadm /dev/md127 --remove /dev/sda2 stop the raid sudo mdadm -S /dev/md127 take out the disk replace it with a new one create the same partitions as on the failling one add it to the raid sudo mdadm --assemble /dev/md127 sudo mdadm /dev/md127 --add /dev/sda2 wait 4 hours All looks fine: cat /proc/mdstat returns: Personalities : [raid10] md127 : active raid10 sda2[0] sdd1[3] sdc1[2] sdb1[1] 2910158464 blocks 64K chunks 2 near-copies [4/4] [UUUU] unused devices: <none> and sudo mdadm --detail /dev/md127 returns /dev/md127: Version : 0.90 Creation Time : Wed Jun 10 13:08:46 2009 Raid Level : raid10 Array Size : 2910158464 (2775.34 GiB 2980.00 GB) Used Dev Size : 1455079232 (1387.67 GiB 1490.00 GB) Raid Devices : 4 Total Devices : 4 Preferred Minor : 127 Persistence : Superblock is persistent Update Time : Thu Mar 21 16:27:40 2013 State : clean Active Devices : 4 Working Devices : 4 Failed Devices : 0 Spare Devices : 0 Layout : near=2 Chunk Size : 64K UUID : 0dbf4558:1a943464:132783e8:19cdff95 (local to host Microknoppix) Events : 0.4824680 Number Major Minor RaidDevice State 0 8 2 0 active sync /dev/sda2 1 8 17 1 active sync /dev/sdb1 2 8 33 2 active sync /dev/sdc1 3 8 49 3 active sync /dev/sdd1 However, there is no trace of the volume groups. Rebooting into knoppix does not help Restarting the old system (I actually replugged and re-added the failing disk for that – the system begins to start, but then fails to see the / partition – no wonder if the volume group is gone) does not help. sudo vgscan, sudo vgdisplay, sudo lvs, sudo lvdisplay, sudo vgscan –mknodes all returned No volume groups found. I am completely at a loss. Can anyone tell me if and how I can recover my data? Thanks in advance!

    Read the article

  • BI&EPM in Focus Oct 2012

    - by Mike.Hallett(at)Oracle-BI&EPM
    Customers Iluka Resources Improves Business Insight into Mining Operations Through Significantly Faster, Customized Analyses Banco do Brasil Monitors Budgets in Real Time, Generates Financial Reports In Minutes Instead of Months General Dynamics Improves Budgeting and Planning and Accelerates Rate Changes by Using Integrated Enterprise Performance Management Suite Facebook achieves world-wide automation of financial close task tracking and management of account reconciliations with Oracle Hyperion Financial Close Management (link) Hess Consolidates Multiple SAP General Ledgers with Oracle Hyperion (link) Navistar Leads with Cutting Edge Hyperion Platform, Including HSF, HPCM (link)   Enterprise Performance Management Oct 10: Navistar Leverages DRM (Rolta Solutions) (link) Replay: Integrated Business Planning, Featuring Leggett & Platt (link)   Business Intelligence Report: From Overload to Impact: An Industry Scorecard on Big Data Business Challenges (link | press release) Oct 10: The Top Five Things You Should Know When Migrating from an Old BI Technology to Oracle Business Intelligence Enterprise Edition (Performance Architects) (link)

    Read the article

  • Move over DFS and Robocopy, here is SyncToy!

    - by andywe
    Ever since Windows 2000, I have always had the need to replicate data to multiple endpoints with the same content. Until DFS was introduced, the method of thinking was to either manually copy the data location by location, or to batch script it with xcopy and schedule a task. Even though this worked (and still does today), it was cumbersome, and intensive on the network, especially when dealing with larger amounts of data. Then along came robocopy, as an internal tool written by an enterprising programmer at Microsoft. We used it quite a bit, especially when we could not use DFS in the early days. It was received so well, it made it into the public realm. At least now we could have the ability to determine what files had changed and only replicate those. Well, over time there has been evolution of this ideal. DFS is obviously the Windows enterprise class service to do this, along with BrancheCache..however you don’t always need or want the power of DFS, especially when it comes to small datacenter installations, or remote offices. I have specific data sets that are on closed or restricted networks, that either have a security need for this, or are in remote countries where bandwidth is a premium. FOr this, I use the latest evolution for one off replication names Synctoy. Synctoy is from Microsoft, seemingly released in 2009, that wraps a nice GUI around setting up a paired set of folders (remember the mobile briefcase from Windows 98?), and allowing you the choice of synchronization methods. 1 way, or 2 way. Simply create a paired set of folders on the source and destination, choose your options for content, exclude any file types you don’t want to replicate, and click run. Scheduling is even easier. MS has included a wrapper for doing just this so all you enter in your task schedule in the SynToyCMD.exe, a –R as an argument, and the time schedule. No more complicated command lines or scripts.   I find this especially useful when I use MS backup to back up a system volume, but only want subsets of backup information of a data share and ONLY when that dataset has changed. Not relying on full backups and incremental. An example of this is my application installation master share. I back this up with SyncToy because I do not need multiple backup copies..one copy elsewhere suffices to back it up. At home, very useful for your pictures, videos, music, ect..the backup is online and ready to access, not waiting for you to restore a backup file, and no need to institute a domain simply to have DFS.'   Do note there is a risk..if you accidently delete a file and do not catch this before the next sync, then depending on your SyncToy settings, you can indeed lose that file as the destination updates..so due diligence applies. I make it a rule to sync manly one way…I use my master share for making changes, and allow the schedule to follow suit. Any real important file I lock down as read only through file permissions so it cannot be deleted unless I intervene.   Check out the tool and have some fun! http://www.microsoft.com/en-us/download/details.aspx?DisplayLang=en&id=15155

    Read the article

  • Clean SOAP Calls from iOS - SudzC

    - by Richard Jones
    This is worth another mention. If you need to call SOAP web services from iOS or JavaScript (and let's face it, who doesn't?), http://SudzC.com really delivers. You give it the URL to your WSDL file (or upload a file) and it just spits out a ready-to-go Xcode project. I would point out that to get it to work 100% I changed line 204 in Soap.m (the commented-out line is the old version, mine is below): //if([child respondsToSelector:@selector(name)] && [[child name] isEqual: name]) { if([child respondsToSelector:@selector(name)] && [[child name] hasSuffix: name]) { I consumed a Microsoft Dynamics NAV set of web-service pages with no problem (and they tend to be fairly complex WSDL definitions).

    Read the article

  • TFS 2010 and Expanding Possibilities

    - by ssmantha
    My organization is migrating to the new Team Foundation Server 2010, and we are all busy experimenting with ALM features. I find it very exciting, and I foresee quite a lot of possibilities for TFS 2010 getting integrated with the Dynamics product line. I hope most of us know about the version control capabilities. However, I see strong possibilities for Team Builds and other automation, thanks to technologies like Workflow Foundation in .NET, which is now being actively used in various TFS scenarios. Lots of experiments, so there is more to come!

    Read the article
