Search Results

Search found 10947 results on 438 pages for 'product comparison'.


  • Hyper-V performance comparisons vs physical client?

    - by rwmnau
    Are there any comparisons between Hyper-V client machines and their physical equivalents? I've looked around and can find 4000 articles about improving Hyper-V performance, but I can't find any that actually do a side-by-side comparison or give benchmarking numbers. Ideally, I'm interested in a comparison of CPU, memory, disk, and graphics performance between something like the following: a powerful workstation (with plenty of RAM) with Windows 7 installed on it directly, versus the same exact workstation running Hyper-V Server 2008 R2 (the bare server role) with a full-screen Windows 7 client machine. Virtual Server 2005 had performance that didn't compare at all with actual hardware, but with the advances in CPU and hardware-level virtualization, has performance improved significantly? How obvious would it be to a user of the two scenarios above that one of them was virtualized, and does anybody know of actual benchmarking of this type?

    Read the article

  • Installing Sql Server 2005 SP2 - Getting error on analysis services component

    - by Greg_the_Ant
    At first many of the components didn't install, and I followed this workaround (fixing user/SID mappings in the registry). After that everything installed successfully except for Analysis Services, which gives the exact same error message as before. (Are there other users installed by SQL Server that I'm not aware of, perhaps?) Do you have any ideas? All of my searches just point back to the workaround above, which I have already applied. Error message from the log:

        Product                    : Analysis Services (MSSQLSERVER)
        Product Version (Previous) : 1399
        Product Version (Final)    :
        Status                     : Failure
        Log File                   : C:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\LOG\Hotfix\OLAP9_Hotfix_KB921896_sqlrun_as.msp.log
        Error Number               : 29528
        Error Description          : MSP Error: 29528  The setup has encountered an unexpected error while Setting Internal Properties. The error is: Fatal error during installation.

    Read the article

  • Is a computer's DRAM size not as important once we get a Solid State Drive?

    - by Jian Lin
    I am thinking of getting a Dell X11 netbook, which can take up to 8GB of DRAM together with a 256GB Solid State Drive. In that case it could handle quite a bit of Virtual PC use, running Linux, Windows XP, etc. But is 8GB of RAM not so important any more? Won't 2GB or 4GB be quite good if a Solid State Drive is used? The biggest worry is usually that when memory runs short, less often used data gets swapped to the pagefile on the hard disk and everything becomes really slow; but with an SSD, is that problem much less of a concern? Is there a comparison along the lines of: if DRAM speed is n, then SSD speed is how many n, and hard disk speed is how many n, just as a ballpark?

    Read the article

  • How can I take an HDMI TV broadcast and overlay text in real time?

    - by ObligatoryMoniker
    Our company wants to have LCD TVs displaying TV with the ability to add an overlay, like a stock ticker at the bottom of the screen, where human resources can add content to be displayed. I have been trying to nail down the correct terminology for this and have come across terms like keying, compositing, live broadcast graphics presentation, and hardware overlay, but I don't know which of these truly describes what I am trying to do. Black Magic offers a product that seems like it can do what I am looking for, but it appears to be geared toward a totally different purpose than mine. Compix also seems to have a product that would do what I need, but again it seems like killing a fly with a sledgehammer. How can I take an HDMI TV broadcast and overlay arbitrary content in real time?

    Read the article

  • Would requiring .Net 4.0 act as a bar to adopting our software in a corporate environment?

    - by Sam
    We are developing a software product in .Net targeting large corporates. The product has both server and desktop client components. We anticipate that our product will be used by a small subset of workers in the corporation - probably those in the Finance function. We currently require .Net 3.5 but are considering moving to .Net 4. Could anyone with experience of managing IT in such an environment tell us whether requiring .Net 4.0 at this stage would be a bar to adopting our software? What attitudes prevail regarding the use of frameworks like .Net?

    Read the article

  • Shell script to read value from a file and compare it to another one

    - by maneeshshetty
    I have a C program which writes one unique value (a two-digit number) into a text file. Now I want to run a shell script that reads that number and compares it with a required number (e.g. 40). The comparison should determine "equal to" or "greater". For example: the output of the C program is written into a file called c.txt with the value 36, and I want to compare it with the number 40 and then echo "equal" or "greater" accordingly. A sketch of one way to do this follows below.
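
    A minimal shell sketch, assuming the file c.txt contains a single number on one line and that 40 is the threshold (the filename and variable names are placeholders for whatever the C program actually produces):

        #!/bin/sh
        # Read the number the C program wrote into the file.
        value=$(cat c.txt)     # e.g. 36
        required=40            # the number to compare against

        if [ "$value" -eq "$required" ]; then
            echo "equal"
        elif [ "$value" -gt "$required" ]; then
            echo "greater"
        else
            # The question only asks for equal/greater, but handling
            # the remaining case costs nothing.
            echo "less"
        fi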

    Read the article

  • Token replacement

    - by ClarkeyBoy
    Hey, I currently have a system on my website whereby I can put something like "[cfe]" anywhere in the site and, when the page is rendered, it will be replaced with the root of the customer front end (the same goes for "[afe]" and the admin front end), so in the admin front end I can put "[cfe]/Default.aspx" to link to the homepage on the customer front end. This is in place because I have a development version of the site, plus a test and a live version. All three may have different roots to each section (for example, the root to the admin front end in test is "/test/Administration/", but in live and development it is just "/Administration/"). Which version it is depends on the URL: all my development sites are in a folder called "development", test is in a folder called "test", and live URLs contain neither. There are also three different databases, one for each, and each obviously requires a different connection string.

    I also have a string replacement function in place which can change, for example, "[Product:Cards]" to point to the Cards catalogue page. The problem is that for this I loop through all the products and do a replacement on "[Product:" & Product.Name() & "]". I would like to take this further: pick up these custom strings when the page is rendered, so the code sees "[Product:Cards]", goes off to find the product "Cards", and replaces the string with a link to the Cards page, rather than looping through all the products and doing a replace just on the off chance that there are any replacements to make.

    One use for this, which I may start using in the future if I can figure out how to do it, is like on Wikipedia, where you put the title of the page you want to point to, then a divider (I think it's a pipe, from memory), then the link text. I would like to apply this to the above situation. This way broken links can also be picked up and reported to admin (a major advantage, as they can then locate them and remove the link or add the product/page that it refers to).

    Eventually I would like to take this to the stage where the content of entire pages can be rearranged (kind of like web parts, but not as advanced as that). I mean so you can put [layout type="3columnImageTopRight" image="imageurl"]Content here[/layout], which will display, as specified, an image in the top right (with padding at the left and bottom) and 3 columns, maybe with the image spanning one or two columns. The imageurl can be specified as another token, maybe like [Image:imagename.gif], which is replaced with the root of the image folder plus the specified filename. I have not really looked into how I am going to split the content into 3 columns yet, but this is something to look at for my dissertation and then implement after my project deadline.

    Does anyone have any ideas or pointers which could help me with this? Also, if this is not strictly token replacement, please point me to what it is called, so I can develop it further. Thanks in advance, Regards, Richard Clarke
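
    One common way to approach this, as a rough sketch: scan the rendered page once with a single regular expression and resolve each token as it is found, rather than looping over every product. The Wikipedia-style pipe divider and the broken-link reporting both fall out naturally. This is a C# sketch rather than the site's own code, and IProductLookup with its two methods is a hypothetical stand-in for whatever product lookup the site already has:

        using System.Text.RegularExpressions;

        // Hypothetical interface standing in for the site's existing product lookup.
        public interface IProductLookup
        {
            string GetProductUrl(string productName);   // null if no such product
            void ReportBrokenLink(string productName);  // notify admin
        }

        public static class TokenReplacer
        {
            // Matches [Product:Cards] and [Product:Cards|Link text].
            private static readonly Regex TokenPattern = new Regex(
                @"\[Product:(?<name>[^\]|]+)(\|(?<text>[^\]]+))?\]",
                RegexOptions.Compiled);

            public static string Replace(string html, IProductLookup lookup)
            {
                return TokenPattern.Replace(html, match =>
                {
                    string name = match.Groups["name"].Value;
                    string text = match.Groups["text"].Success
                                      ? match.Groups["text"].Value
                                      : name;

                    string url = lookup.GetProductUrl(name);
                    if (url == null)
                    {
                        lookup.ReportBrokenLink(name); // broken token: report it
                        return text;                   // and degrade to plain text
                    }
                    return string.Format("<a href=\"{0}\">{1}</a>", url, text);
                });
            }
        }

    Because the replacement is driven by what actually appears in the page, tokens that refer to missing products are discovered (and reportable) at render time instead of going unnoticed.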

    Read the article

  • Const references when dereferencing iterator on set, starting from Visual Studio 2010

    - by Patrick
    Starting from Visual Studio 2010, iterating over a std::set returns an iterator that dereferences the data as 'const data' instead of non-const. The following code is an example of something that compiles on Visual Studio 2005, but not on 2010 (this is an artificial example, but it clearly illustrates the problem we found in our own code). In this example, I have a class that stores a position together with a temperature. I define comparison operators (not all of them, just enough to illustrate the problem) that only use the position, not the temperature. The point is that for me two instances are identical if the position is identical; I don't care about the temperature.

        #include <set>

        class DataPoint
        {
        public:
            DataPoint(int x, int y) : m_x(x), m_y(y), m_temperature(0) {}

            void setTemperature(double t) { m_temperature = t; }

            bool operator<(const DataPoint& rhs) const
            {
                if (m_x == rhs.m_x) return m_y < rhs.m_y;
                else                return m_x < rhs.m_x;
            }

            bool operator==(const DataPoint& rhs) const
            {
                if (m_x != rhs.m_x) return false;
                if (m_y != rhs.m_y) return false;
                return true;
            }

        private:
            int m_x;
            int m_y;
            double m_temperature;
        };

        typedef std::set<DataPoint> DataPointCollection;

        int main()
        {
            DataPointCollection points;
            points.insert(DataPoint(1,1));
            points.insert(DataPoint(1,1));
            points.insert(DataPoint(1,2));
            points.insert(DataPoint(1,3));
            points.insert(DataPoint(1,1));

            for (DataPointCollection::iterator it = points.begin(); it != points.end(); ++it)
            {
                DataPoint &point = *it;
                point.setTemperature(10);
            }
            return 0;
        }

    In the main routine I have a set to which I add some points. To check the correctness of the comparison operator, I add data points with the same position multiple times. When writing the contents of the set, I can clearly see there are only 3 points in the set. The for-loop loops over the set and sets the temperature. Logically this is allowed, since the temperature is not used in the comparison operators. This code compiles correctly in Visual Studio 2005, but gives a compilation error in Visual Studio 2010 on the following line (in the for-loop):

        DataPoint &point = *it;

    The error given is that it can't assign a "const DataPoint" to a [non-const] "DataPoint &". It seems that you have no decent (= non-dirty) way of writing this code in VS2010 if you have a comparison operator that only compares some of the data members. Possible solutions are: adding a const-cast to the line where it gives an error, or making the temperature mutable and making setTemperature a const method. But to me both solutions seem rather 'dirty'. It looks like the C++ standards committee overlooked this situation. Or not? What are clean solutions to this problem? Did any of you encounter this same problem, and how did you solve it? Patrick
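
    For reference, a compilable sketch of the second workaround mentioned above (a mutable temperature plus a const setTemperature), which is how this is often written once set elements are treated as const; whether it is too 'dirty' is exactly what the question is asking:

        #include <set>

        class DataPoint
        {
        public:
            DataPoint(int x, int y) : m_x(x), m_y(y), m_temperature(0) {}

            // const method: callable on a const element of the set.
            void setTemperature(double t) const { m_temperature = t; }

            bool operator<(const DataPoint& rhs) const
            {
                return (m_x == rhs.m_x) ? m_y < rhs.m_y : m_x < rhs.m_x;
            }

        private:
            int m_x;
            int m_y;
            // mutable: the temperature plays no part in the ordering, so
            // changing it cannot break the set's sorting invariant.
            mutable double m_temperature;
        };

        int main()
        {
            std::set<DataPoint> points;
            points.insert(DataPoint(1,1));
            points.insert(DataPoint(1,2));

            for (std::set<DataPoint>::const_iterator it = points.begin(); it != points.end(); ++it)
            {
                it->setTemperature(10); // compiles: const method on const element
            }
            return 0;
        }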

    Read the article

  • How can I display the products clicked by a user in a list in another view?

    - by Avar
    I am using the MVC3 view model pattern with Entity Framework in my web application. My Index view is a list of products, each shown in a div box with an image, price, description, and a "buy" button. I will be working with two views: the Index view that displays all the products, and a Cart view that should display the products whose buy button was clicked. What I am trying to achieve is that when a user clicks a buy button, the product gets stored and is then displayed in the Cart view. The Index view is done; it's the buy button function that I don't know how to start on. This is my IndexController:

        private readonly HomeRepository repository = new HomeRepository();

        public ActionResult Index()
        {
            var Productlist = repository.GetAllProducts();
            var model = new HomeIndexViewModel() { Productlist = new List<ProductsViewModel>() };
            foreach (var Product in Productlist)
            {
                FillProductToModel(model, Product);
            }
            return View(model);
        }

        private void FillProductToModel(HomeIndexViewModel model, ProductImages productimage)
        {
            var productViewModel = new ProductsViewModel
            {
                Description = productimage.Products.Description,
                ProductId = productimage.Products.Id,
                price = productimage.Products.Price,
                Name = productimage.Products.Name,
                Image = productimage.ImageUrl,
            };
            model.Productlist.Add(productViewModel);
        }

    In the Index action I use my repository to get the products and then bind the product data to my view model, so I can use the view model inside my view. That is how I display all the products. This is my Index view:

        @model Avan.ViewModels.HomeIndexViewModel
        @foreach (var item in Model.Productlist)
        {
            <div id="productholder@(item.ProductId)" class="productholder">
                <img src="@Html.DisplayFor(modelItem => item.Image)" alt="" />
                <div class="productinfo">
                    <h2>@Html.DisplayFor(modelItem => item.Name)</h2>
                    <p>@Html.DisplayFor(modelItem => item.Description)</p>
                    @Html.Hidden("ProductId", item.ProductId, new { @id = "ProductId" })
                </div>
                <div class="productprice">
                    <h2>@Html.DisplayFor(modelItem => item.price)</h2>
                    <input type="button" value="Läs mer" class="button" id="button@(item.ProductId)">
                    @Html.ActionLink("x", "Cart", new { id = item.ProductId }) // <- temp; it's going to be a button
                </div>
            </div>
        }

    Since I have the product ID per product, I can use the ID in my controller to get the data from the database. But I still have no idea how to arrange it so that when somebody clicks the buy button I store the ID -- where? And how do I use it to achieve what I want? Right now I have the following in my IndexController:

        public ActionResult cart(int id)
        {
            var SelectedProducts = repository.GetProductByID(id);
            return View();
        }

    What I did here is get the product by its id, so when someone presses the temporary "x" ActionLink I receive the product. All I know is that something like this is needed to achieve what I'm trying to do, but after that I have no idea what to do or in what kind of structure to do it. Any kind of help is appreciated a lot!

    Short scenario: looking at the Index I see 5 products, and I choose to buy 3 of them, so I click three "buy" buttons. Now I click on "Cart" in the nav menu. A new view pops up and I see the three products that I clicked to buy.
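
    One common approach, sketched with stated assumptions: keep the clicked product IDs in the session from the Cart action, then rebuild a list of view models for the Cart view using the repository that already exists. The "CartIds" session key is invented for the sketch, and GetProductByID is assumed to return the same ProductImages shape that GetAllProducts yields; this method would replace the cart action inside the same controller:

        public ActionResult Cart(int? id)
        {
            // Collect the clicked product ids in the session so they survive
            // between the Index clicks and the Cart page view.
            var ids = Session["CartIds"] as List<int> ?? new List<int>();
            if (id.HasValue && !ids.Contains(id.Value))
            {
                ids.Add(id.Value);
            }
            Session["CartIds"] = ids;

            // Rebuild a view model for each stored id using the existing repository.
            var model = new List<ProductsViewModel>();
            foreach (var productId in ids)
            {
                var productimage = repository.GetProductByID(productId);
                model.Add(new ProductsViewModel
                {
                    Description = productimage.Products.Description,
                    ProductId = productimage.Products.Id,
                    price = productimage.Products.Price,
                    Name = productimage.Products.Name,
                    Image = productimage.ImageUrl,
                });
            }
            return View(model);
        }

    The Cart view would then be typed to @model List<Avan.ViewModels.ProductsViewModel> and loop over it much like the Index view does. The buy buttons (or the temporary ActionLink) just need to point at /Home/Cart/{id}, and visiting Cart without an id simply shows the current contents.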

    Read the article

  • Same source, multiple targets with different resources (Visual Studio .Net 2008)

    - by Mike Bell
    A set of software products differ only by their resource strings, binary resources, and by the strings / graphics / product keys used by their Visual Studio Setup projects. What is the best way to create, organize, and maintain them? That is, all the products essentially consist of the same core functionality, customized by graphics, strings, and other resource data to form each product. Imagine you are creating a set of products like "Excel for Bankers", "Excel for Gardeners", "Excel for CEOs", etc. Each product has the same functionality, but differs in name, graphics, help files, included templates, etc. The environment in which these are being built is vanilla Windows.Forms / Visual Studio 2008 / C# / .Net. The ideal solution would be easy to maintain; e.g. if I introduce a new string / new resource, projects I haven't added the resource to should fail at compile time, not run time. (Subsequent localization of the products should also be feasible.) Hopefully I've missed the blindingly obvious and easy way of doing all this. What is it?

    ============ Clarification(s) ================

    By "product" I mean the package of software that gets installed by the installer and sold to the end user. Currently I have one solution, consisting of multiple projects (including a Setup project), which builds a set of assemblies and creates a single installer. What I need to produce is multiple products/installers, all with similar functionality, which are built from the same set of assemblies but differ in the set of resources used by one of the assemblies. What's the best way of doing this?

    ------------ The 95% Solution -----------------

    Based upon Damien_the_unbeliever's answer, a resource file per configuration can be achieved as follows:

    1. Create a class library project ("Satellite").
    2. Delete the default .cs file and add a folder ("Default").
    3. Create a resource file in the folder, "MyResources".
    4. In Properties, set CustomToolNamespace to something appropriate (e.g. "XXX").
    5. Make sure the access modifier for the resources is "Public".
    6. Add the resources.
    7. Edit the source code; refer to the resources in your code as XXX.MyResources.ResourceName.
    8. Create configurations for each product variant ("ConfigN").
    9. For each product variant, create a folder ("VariantN").
    10. Copy and paste the MyResources file into each VariantN folder.
    11. Unload the "Satellite" project and edit the .csproj file.
    12. For each "VariantN/MyResources" <Compile> or <EmbeddedResource> tag, add a Condition="'$(Configuration)' == 'ConfigN'" attribute (see the sketch below).
    13. Save, reload the .csproj, and you're done.

    This creates a per-configuration resource file, which can (presumably) be further localized. Compile error messages are produced for any configuration where a resource is missing. The resource files can be localized using the standard method (create a second resource file, MyResources.fr.resx, and edit the .csproj as before). The reason this is a 95% solution is that resources used to initialize forms (e.g. form titles, button texts) can't easily be handled in the same manner; the easiest approach seems to be to overwrite these with values from the satellite assembly.
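
    As a sketch of what step 12's edit might look like in the .csproj (assuming two variants with configurations named "Config1" and "Config2"; the generator and namespace values mirror the settings chosen above):

        <ItemGroup>
          <!-- Only the matching variant's resources are built into a given configuration. -->
          <EmbeddedResource Include="Variant1\MyResources.resx"
                            Condition="'$(Configuration)' == 'Config1'">
            <Generator>PublicResXFileCodeGenerator</Generator>
            <CustomToolNamespace>XXX</CustomToolNamespace>
          </EmbeddedResource>
          <EmbeddedResource Include="Variant2\MyResources.resx"
                            Condition="'$(Configuration)' == 'Config2'">
            <Generator>PublicResXFileCodeGenerator</Generator>
            <CustomToolNamespace>XXX</CustomToolNamespace>
          </EmbeddedResource>
        </ItemGroup>

    With this in place, building "Config1" embeds only Variant1's strings, and a resource missing from one variant's file shows up as a compile-time error for that configuration only.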

    Read the article

  • When should I use Areas in TFS instead of Team Projects

    - by Martin Hinshelwood
    Well, it depends…. If you are a small company that creates a finite number of internal projects, then you will find it easier to create a single project for each of your products and have TFS do the heavy lifting with reporting, SharePoint sites and version control. But what if you are not…

    Update 9th March 2010: Michael Fourie gave me some feedback which I have integrated. Ed Blankenship via @edblankenship offered encouragement and a nice quote. Ewald Hofman gave me a couple of cons, and maybe a few more soon. Ewald's company, Avanade, currently uses Areas, but it looks like the manual management is getting too much and the project is getting cluttered.

    What if you are likely to have hundreds of projects, possibly with a multitude of internal and external projects? You might have 1 project for a customer, or 10. This is the situation most consultancies find themselves in, and thus they need a more sustainable and maintainable option. What I am advocating is that we should have one "Team Project" per customer, and use Areas to create "sub projects" within that single Team Project.

    "What you describe is what we generally do internally and what we recommend. We make very heavy use of area path to categorize the work within a larger project." - Brian Harry, Microsoft Technical Fellow & Product Unit Manager for Team Foundation Server

    "We tend to use areas to segregate multiple projects in the same team project and it works well." - Tiago Pascoal, Visual Studio ALM MVP

    "In general, I believe this approach provides consistency [to multi-product engagements] and lowers the administration and maintenance costs. All good." - Michael Fourie, Visual Studio ALM MVP

    "@MrHinsh BTW, I'm very much a fan of very large, if not huge, team projects in TFS. Just FYI :) Use Areas & Iterations." - Ed Blankenship, Visual Studio ALM MVP

    This would mean that SSW would have a single Team Project called "SSW" that contains all of our internal projects, and consequently all of the Areas and Iterations move down one level in the hierarchy to accommodate this. Where we would have had "\SSW\Sprint 1" we now have "\SSW\SqlDeploy\Sprint 1", with "SqlDeploy" being our internal project. At the moment SSW has over 70 internal projects and more than 170 total projects in TFS. This method has long-term benefits that help to simplify the support model for companies that often have limited internal support time and many projects. But there are implications, as TFS does not provide this model out-of-the-box. These implications stretch across Areas, Iterations, Queries, the Project Portal and Version Control.

    Michael made a good comment: "I agree with your approach, assuming that in a multi-product engagement with a client, they are happy to adopt the same process template across all products. If they are not, then it'll either be easy to convince them or there is a valid reason for having a different template." - Michael Fourie, Visual Studio ALM MVP

    At SSW we have a standard template that is applied across the board, to all of our projects. We even apply any changes to the core process template to all of our existing projects as well. If you have multiple projects for the same clients on multiple templates and you want to keep it that way, then this approach will not work for you. However, if you want to standardise as we have at SSW, then this approach may benefit you as well.

    Implications around Areas

    Areas should be used for topological classification/isolation of work items. You can think of this as architecture areas, organisational areas or even the main features of your application. In our scenario there is an additional top-level item that represents the project/product that we want to chop our Team Project into.

    Figure: Creating a sub-area to represent a product/project is easy.

        <teamproject>
        <teamproject>\<Functional Area/module whatever>

    Becomes:

        <teamproject>
        <teamproject>\<ProjectName>\
        <teamproject>\<ProjectName>\<Functional Area/module whatever>

    Implications around Iterations

    Iterations should be used for chronological classification/isolation of work items. This could include isolated time boxes, milestones or release timelines, and really depends on the logical flow of your project or projects. Due to the new level in Area we need to add the same level to Iteration, primarily because it is unlikely that the sprints in each of your projects/products will start and end at the same time. This is just a reality of managing multiple projects.

    Figure: Adding the same Area value to Iteration as the top-level item adds flexibility to Iteration.

        <teamproject>\Sprint 1
        or
        <teamproject>\Release 1\Sprint 1

    Becomes:

        <teamproject>\<ProjectName>\Sprint 1
        or
        <teamproject>\<ProjectName>\Release 1\Sprint 1

    Implications around Queries

    Queries are used to filter your work items based on a specified level of granularity. There are a number of queries built into a project created using the MSF Agile 5.0 template, but we now have multiple projects, and it would be a pain to have to edit all of the work item queries every time we changed project; that would also only allow one team to work on one project at a time.

    Figure: The queries that are created in a normal MSF Agile 5.0 project do not quite suit our new needs.

    In order for project contributors to be able to query based on their project, we need a couple of things. The first thing I did was to create an "_Area Template" folder that has a copy of the project layout with all the queries set up to filter based on the "_Area Template" Area and the "_Sprint Template" you can see in the Area and Iteration views.

    Figure: The template is currently easy to drag and drop, but you then need to edit the queries to point at the right Area and Iteration. This needs a tool.

    I then created an "Areas" folder to hold all of the area-specific queries. So, when you go to create a new TFS sub-project you just drag "_Area Template" while holding "Ctrl" and drop it onto "Areas". There is a little setup here. That said, I managed it in around 10 minutes, which is not so bad, and I can imagine it being quite easy to build a tool to create these queries.

    Figure: These new queries can be configured in around 10 minutes, which includes setting up the Area and Iteration as well.

    Version Control

    What about your source code? Well, that is the easiest of the lot. Just create a sub folder for each of your projects/products.

    Figure: Creating sub folders in source control is as easy as "Right click | Create new folder".

        <teamproject>\DEV\Main\

    Becomes:

        <teamproject>\<ProjectName>\DEV\Main\

    Conclusion

    I think it is up to each company to make a call on how to configure Team Projects, and it depends completely on how many projects/products you are going to have for each customer, including yourself. If we decide to utilise this route it will require some configuration to get our 170+ projects into this format, and I will probably be writing some tools to help.

    Pros

    - You only have one project to upgrade when a process template changes. After going through an upgrade of over 170 projects prior to the changes in the RC, I can tell you that that many projects is no fun.
    - Standardises your process template. You will always have the same process implementation across projects/products, without exception.
    - You get tighter control over the permissions. Yes, you can do this on a standard Team Project, but it gets a lot easier with practice.
    - You can "move" work items from one "product" to another. Have we not always wanted to do that?
    - You can rename your projects. Wahoo: everyone wants to do this, and now you can.
    - One set of Reporting Services reports to manage. You set an area and iteration to run reports anyway, so you may as well set both.
    - Simplified check-in policies. There is only one set of check-in policies per client, which simplifies administration of policies.
    - Simplified alerts. As alerts are applied across multiple projects, this simplifies your alert rules per client.

    Cons

    All of these cons could be mitigated by a custom tool that helps automate creation of "sub-projects" within Team Projects. This custom tool could create Areas, Iterations, permissions, SharePoint sites and queries. It just does not exist yet :)

    - You need to configure the Areas and Iterations.
    - You need to configure the permissions.
    - You may need to configure sub sites for SharePoint (depends on your requirements). If you have two projects/products in the same Team Project then you will not see the burndown for each one out-of-the-box, but rather a cumulative for the Team Project. This is not really that much of a problem, as you would have to configure your burndown graphs for your current iteration anyway. Note: when you create a sub site to a TFS-linked portal it will inherit the settings of its parent site :) This is fantastic, as it means that you can easily create sub sites and then set the Area and Iteration path in each of the reports to the correct one.
    - Every team wants their own customization (via Ewald Hofman): small teams of 2 people versus teams of 30, or even outsourcing, need their own process; you cannot allow that, because everybody gets the same work item types. Note: luckily at SSW this is not a problem, as our template is standardised across all projects and customers.
    - Large list of builds (via Ewald Hofman): as the build list in Team Explorer is just a flat list, it can get very cluttered. Note: I would mitigate this by removing any build that has not been run in over 30 days. The build template and workflow would still be available in version control, but it would clean up the list.

    Feedback

    Now that I have explained this method, what do you think? What other pros and cons can you see? What do you think of this approach? Will you be using it? What tools would you like to support you?

    Technorati Tags: Visual Studio ALM, TFS Administration, TFS, Team Foundation Server, Project Planning, TFS Customisation

    Read the article

  • SQL SERVER – Server Side Paging in SQL Server 2011 – Part 2

    - by pinaldave
    The best part of having a blog is that the SQL community helps to keep it running with new ideas. Earlier I wrote about SQL SERVER – Server Side Paging in SQL Server 2011 – A Better Alternative, a very popular article on that subject. There I had used variables for the "number of rows" and "number of pages". A blog reader sent me an email saying that in their organization these values are stored in a table, and asking whether the new syntax can read the data from the table. Absolutely YES! Here is the quick script:

        USE AdventureWorks2008R2
        GO
        CREATE TABLE PagingSetting (RowsPerPage INT, PageNumber INT)
        INSERT INTO PagingSetting (RowsPerPage, PageNumber)
        VALUES (10, 5)
        GO
        SELECT *
        FROM Sales.SalesOrderDetail
        ORDER BY SalesOrderDetailID
        OFFSET (SELECT RowsPerPage * PageNumber FROM PagingSetting) ROWS
        FETCH NEXT (SELECT RowsPerPage FROM PagingSetting) ROWS ONLY
        GO

    This is really an easy trick. I also wrote a blog post comparing the performance: SQL SERVER – Server Side Paging in SQL Server 2011 Performance Comparison. Reference: Pinal Dave (http://blog.sqlauthority.com)

    Filed under: Pinal Dave, SQL, SQL Authority, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology. Tagged: SQL Paging

    Read the article

  • Oracle's Primavera P6 Analytics Now Available!

    - by mark.kromer
    Oracle's Primavera product team announced this week the general availability of our first Oracle BI (OBI) based analytical product, with pre-built business intelligence dashboards, reports and KPIs built in. P6 Analytics uses OBI's drill-down capabilities, summarizations, hierarchies and other BI features to give your business users the knowledge to make the best decisions on portfolios, projects, schedules & resources, with deep insights. Without needing to launch the P6 tool, your executives, PMO, project sponsors, etc. can view up-to-date project performance information as well as historic trends of project performance. Using web-based portal technology, P6 Analytics makes it easy to manage by exception and then drill down to quickly identify the root cause of problem projects. At the same time, a brand new version of the P6 Reporting Database, R2, was just announced and is also now available. This updated reporting database provides you with 4 star schemas with spread data and includes P6 activity, project and resource codes. You can use the data warehouse and ETL functions of the P6 Reporting Database R2 with your own reporting tools, or build dashboards that utilize the hierarchies & drill down to the day level on scheduled activities using Business Objects, Cognos, Microsoft, etc. Both of these products can be downloaded from E-Delivery under the Primavera applications section in the P6 EPPM v7.0 media pack. I put some examples below of the resource utilization, earned value, landing page and portfolio analysis dashboards that come out of the box with P6 Analytics, to give you an idea of the deep insights you get into your projects & portfolios on day 1 of using the tool. Please send an email to Karl or me if you have any questions or would like more information. The Oracle Technology Network and Oracle.com marketing sites are currently being refreshed with further details of these exciting new releases of the Primavera BI and data warehouse products. Lastly, scroll below for some screenshots of the new P6 Analytics R1 product using OBIEE! Thanks, Mark Kromer

    Read the article

  • New Release Overview Part 1

    - by brian.harrison
    Ladies & Gentlemen, I have been getting a lot of questions over the last month or two about the next release of WCI, codenamed "Neo". Unfortunately I cannot give you an exact release date, which I know you would all be asking me for if we were talking face to face, but I can definitely provide you with information about some of the features that will be made available. So over the next few blog entries, I am going to provide you with details about two features and even provide you with screenshots for some of them.

    KD Browser Portlet

    This portlet will provide a Windows Explorer look and feel for the Knowledge Directory from within a Community Page or My Page. Not only will the portlet provide access to the folder structure and the documents within, but the user or community manager will also have the ability to modify what is being shown. From within a preferences page, the user or community manager can change which top-level folders are shown within the folder structure, as well as which properties are available for each document that is shown. There are also a number of other portlet-specific customizations available.

    Embedded Tagging Engine

    As some of you might be aware, there was a product made available just prior to the Oracle acquisition known as Pathways, which gave users the ability to add tags to documents that were either in the Knowledge Directory or in the Collaboration Documents section. Although this product is no longer available separately for customers to purchase, we definitely felt that the functionality was important and interesting enough that other customers should have access to it. The decision was made for this release to embed the original Pathways product as the tagging engine for WCI and Collaboration. This tagging engine will allow a user to add tags to documents in the Knowledge Directory as well as in the Collaboration Documents section. Once the tags are added to the tagging engine and associated with documents, a user will have the ability to filter documents when processing a search, according to the tag cloud that will now be available on the search results page; this will be true no matter what kind of search is being processed. In addition to all of that, all of the Pathways portlets will also be available for users to add to their My Page.

    Read the article

  • Visual Studio 11 not 2011

    - by Daniel Moth
    A little pet peeve of mine is when people incorrectly refer to the Developer Preview (or the upcoming Beta) as Visual Studio 2011 instead of the correct Visual Studio 11. The "11" refers to the version number (internally we call it Dev11). What the product will be called when it is released is anyone's guess (it could keep the name, it could have a year appended to it, or it could be something else entirely, who knows). Even if it does have a year appended to the name, I think it is a safe bet it won't be last year! For reference, version 10 was the previous version of Visual Studio, which happened to be released in 2010, hence it got the name Visual Studio 2010. That is what confuses people new to this product, I guess... they think that the two-digit number matches the year, just because the two coincided last time. (btw, internally we called it Dev10). For further reference, older releases were: Visual Studio 2008 (v9) aka "Orcas", Visual Studio 2005 (v8) aka "Whidbey", Visual Studio .NET 2003 (v7.1) aka "Everett", and Visual Studio .NET 2002 (v7) aka "Rainier". Before that, we were in the pre-.NET era with Visual Studio 6 (where the version and the product name matched, without the year appended to the name). So next time you hear someone say "Visual Studio 2011", point them to this post for some mini-education... thanks. Comments about this post by Daniel Moth are welcome at the original blog.

    Read the article

  • New Agile PLM Customer Testimonial Videos on YouTube

    - by Kerrie Foy
    Have you visited the Oracle Agile PLM channel on YouTube recently? There are many new video testimonials, and even an overview of how Oracle Agile PLM helps companies drive powerful corporate performance by maximizing product profitability. Here are a few highlights...

    Oracle Agile PLM: Proven Results - Watch an overview of the transformative success our customers have realized using Oracle Agile PLM applications to take their company to the next level.

    Alcatel-Lucent Ups Competitive Edge with Oracle Agile PLM and Oracle EBS - Brad Magnani of Alcatel-Lucent Enterprise describes how the Oracle Agile PLM and Oracle EBS solutions help speed time to market, eliminate wasted cash, secure data, and ensure product quality, enabling innovation and success.

    Herbalife: an Oracle Agile PLM Customer Video Filmed at OpenWorld 2010 - Listen to Gary Swanson of Herbalife describe how his organization realizes powerful new insight into product information with Agile PLM Business Intelligence (BI).

    Tyson: an Oracle Agile PLM for Process Customer Video Filmed at OpenWorld 2010, featuring Kim Glenn

    Tyson: an Oracle Agile PLM for Process Customer Video Filmed at OpenWorld 2010, featuring Amber Woods

    We are so proud to have two testimonials from Tyson Foods! Tune in to each to see the unique perspectives on Agile PLM for Process at Tyson from different organizational views, demonstrating Oracle's ability to enable enterprise-wide PLM implementations that deliver superior results. Take a moment to view these interesting customer testimonials to learn how Oracle Agile PLM applications are helping companies succeed. Subscribe to our YouTube channel today!

    Read the article

  • Implement Budget Allocation in DAX for Power Pivot and Tabular #powerpivot #tabular #ssas #dax

    - by Marco Russo (SQLBI)
    Comparing sales and budget, or costs and budget, is a very common operation. However, it is often the case that the tables containing the budget and the data to compare it with have different granularities. There are two ways to handle that: you can limit the comparison to the granularity that is common to the two tables, or you can allocate the budget where it is not defined. For example, if you have a budget defined by quarter and category, you might want to allocate it by month and product. (As a simple illustration: a 90,000 budget defined for a 90-day quarter could be allocated to a 31-day month as 31/90 of the total, i.e. 31,000, when allocating proportionally by days.) In this way, you can do the comparison as if you had a more granular definition of the budget, without actually having to do the manual job of allocating the data (usually in an Excel worksheet!). If you want to do budget allocation in DAX, you can use the Budget Patterns we published on DAX Patterns. If you come from an MDX/OLAP background, at first you might find it hard to work without the attribute hierarchies that help you propagate budget values to lower hierarchical levels. However, I think that once you get used to DAX, you will find the behavior very predictable and easy to "debug", even for more complex allocation formulas. You just have to be careful in writing the DAX formula, but the pattern we wrote should help you design the right data model, without creating physical relationships to the budget table! This pattern is also based on the Handling Different Granularities scenario I discussed a couple of weeks ago.

    Read the article

  • Why Standards Only Get You So Far

    - by Tim Murphy
    Over the years I have been exposed to a number of standards. EDI was the first. More recently it has been the CIECA standard for insurance, and now the embattled document standards of Open XML and ODF. Standards actually came up at the last CAG meeting. The debate was over how effective they really are. Even back in the late 80s to early 90s, people found they had to customize these standards to get any work done. One vendor about a year ago even told me that they really weren't standards, they were more of a guideline. The problem is that standards are created either by committee or by companies trying to sell a product. They never fit all situations. This is why most of them leave extension points in their definition. Of course, if you use those extension points, everyone has to have custom code to know how to consume the new product. Standards increase reliability, but they stifle innovation and slow the time-to-market cycle of products. In this age of ever-shortening windows of opportunity, that could mean a company loses its competitive advantage. I believe that standards are not only good, but essential. I also believe that they are not a silver bullet. People who turn competing standards into a kind of holy war are really missing the point. I think we should make the best standards we can, whether that is for a product so that customers can use its API, or by committee so that they cross products. But they also need to be as feature-rich and flexible as possible. They can't be just the lowest common denominator, since that type of standard will be broken the day it is published. In the end, though, the market will vote with its dollars.

    del.icio.us Tags: Office Open XML, ODF, Standards, EDI

    Read the article

  • Surface RT: To Be Or Not To Be (Part 1)

    - by smehaffie
    So the Surface RT has been out for 9 months and Microsoft just declared a $900 million write-down. How did this happen, and what does it mean for Microsoft's efforts to break into the tablet market? I have been thinking a lot about most of the information below since the Surface product line was released. If you are looking for a "Microsoft Is Dead" story, then don't read any further. But if you want an honest look at what I think led Microsoft to this point, and what I think can be done to make Surface RT devices better, then please continue reading.

    What Led Microsoft To The $900 Million Write-Down

    Surface Unveiling: Microsoft totally missed the boat when they unveiled the Surface product line on June 18th, 2012. Microsoft should have been ready to post the specifications of both devices that night. Microsoft should have had a site up and running right after the event so people could pre-order the devices. This would have given them a good idea of the interest in each device, and they could have used this data to make a better estimate of the number of units to have available for the launch and beyond. They also lost out on taking advantage of the excitement generated by the Surface RT and Surface Pro announcement. They could have thrown in a free touch keyboard for anyone who pre-ordered. The advertising should have started right after the announcement and gotten bigger as launch day approached: push for as many pre-orders as possible and build excitement for the launch.

    Actual Launch (Surface RT): By this time all excitement from the initial announcement was gone, except among the Microsoft faithful. Microsoft should have been ready to sell the Surface in as many markets as possible at launch; the limited market release was a real letdown for a lot of people. A limited release right after the initial announcement would have been understandable, but not at the official launch of the product. Microsoft also overpriced the device, and now they are lowering the price to what it should have been to start with. The $349 price is within the range I suggested before pricing was announced (see "Surface Tablets: The Price Must Be Right"). Limited ordering options online were also a killer: users should have been able to buy the base unit of each device and then add whatever keyboard they wanted (this applies more to the Surface Pro). There should also have been a place where users could order any additional add-ons they wanted (covers, extra power supplies, etc.). Marketing was better by then, and the dancing "Click In" commercial was cool, but the ads comparing the iPad with Siri should have been on the air from day one of the announcement (or at least the launch). Consumers want to know why your tablet is better, not just that it has a clickable keyboard and a built-in kickstand. They could also have compared it to some of the other mid-range tablets, had they not overpriced it to begin with.

    Stock Applications (Mail, People, Calendar, Music, Video, Reader and IE): This is where Microsoft really blew it. They had all the time in the world to make these applications best of breed, and instead we got applications that seemed thrown together. Some updates have made these applications better, but they are all still lacking features that should have been there from day one. This did nothing to enhance a new user's experience. I will admit that the data-driven applications were first-class citizens, which makes it even more perplexing that Microsoft could knock it out of the park with Weather, Travel, Finance, Bing, etc., and fail so miserably on the core applications users would use the most on a tablet.

    Desktop on Tablet: The desktop just feels out of place on a tablet. I understand it was needed for Office, but it would have been better not to have the desktop in Windows RT at all, and instead open the Office applications in full-screen mode, in a desktop shell (the same goes for IE11). That way the user wouldn't realize they are leaving Metro and going to the desktop. The other option would have been to simply not include Office on Windows RT devices. Instead they could have made awesome Windows Store apps for Word, Excel, OneNote and PowerPoint. In addition, they could have given the stock Mail, People, and Calendar applications all the functions that Outlook gives desktop users. Having some of the settings in desktop mode and others under "Change PC Settings" made Windows RT seem unfinished and rushed to market.

    What Can Be Done To Make Windows RT Based Tablets Better (At Least In My Opinion)

    - Either eliminate the desktop altogether from Windows RT or at least improve the user experience by hiding the fact that the user is running Office/IE in the desktop. Personally I'd like them to get rid of it entirely and just make awesome Windows Store app versions of Word, Excel, PowerPoint and OneNote. This might also make the OS smaller and give the user more available disk space. I doubt there will ever be Windows Store app versions of Office, but I still think it is a good idea.
    - Make it so users can easily direct their documents, pictures, videos and music to their extra storage and access these files from the standard libraries. A user should not have to create a VM on their microSD card or create symbolic links to get this to work properly; most consumers would not be able to do this. Users get frustrated when they run out of room on their main storage because nothing is automatically saved to the microSD card when saved to libraries. This is a major bug that needs to be fixed, otherwise Microsoft's selling point of having a microSD slot is worthless.
    - Allow users to uninstall and reinstall any of the Office products that come with the Surface, so people can free up storage space by uninstalling the Office applications they do not need. Everyone's needs are different, so make the options flexible. Don't take up storage space with applications the user will not use.
    - Make the core applications the cream of the crop of Windows Store applications. They should set the bar for all other Store applications.
    - Improve performance as much as possible. If it seems sluggish on a tablet, consumers will not buy it.
    - Price the next line of Surface products very aggressively, to undercut not only the iPad but also low-end Android tablets (Nook, Kindle Fire, Nexus, etc.).
    - Give developers incentives to write quality applications for the devices. Don't reward developers for cranking out cookie-cutter, low-quality applications. I'd even suggest Microsoft consider implementing new Store certification guidelines to stop these types of applications from being published.
    - Allow users to easily move the recovery disk partition between their microSD card and main storage.

    My Predictions For The Surface RT And Windows RT

    I honestly think that even with all the missteps Microsoft has made since the announcement of the Surface product line, they are on the right path. I was excited about the Surface tablets when they were announced, and I still am. Truth be told, Windows 8 on a tablet (aka Windows RT) is better than both iOS and Android. My nephew, who is an Apple fanboy, told me after he saw and used Windows 8 (he got the beta running on his iPad) that Windows 8 kicked Apple's butt as a tablet OS. So there is hope for all Windows RT based tablets. I agree with my nephew, and that is why whenever anyone asks me about my Surface, I love showing it off and recommend it.

    The 6 keys to gaining market share in the tablet market are:

    1. Aggressive pricing by both Microsoft and their OEMs.
    2. Good quality devices put out by Microsoft and their OEMs (there are some out there, but not enough).
    3. Marketing, marketing, marketing from both Microsoft and their OEMs (we need more ads showing why Windows based tablets are better than iPads and Android tablets).
    4. Getting Windows tablets into retail stores everywhere, and giving sales people incentives to sell them. Consumers like to try electronics out before they buy them, and most will listen to what the sales person suggests. Microsoft needs sales people in retail stores directing people to buy Windows based tablets over iPads and Android tablets. I think the Microsoft Stores within Best Buy are a good start, but they also need prominent displays in Walmart, Target, etc.
    5. Release a smaller form factor Surface. Hopefully the 8"-10" next generation Surface is not a rumor.
    6. Make "Surface" the brand name for all Microsoft tablets and hybrid devices that they come out with. They cannot change the name with each new release. Make Surface synonymous with quality, the same way the iPad is for Apple.

    Well, that is my 2 cents on the subject. Let me know your thoughts by leaving a comment below. Soon to follow will be my thoughts on the Surface Pro, so keep an eye out for it.

    Read the article

  • Welcome to the Weblog on Oracle ADF Mobile!

    - by joe.huang
    Welcome to the ADF Mobile team's weblog. My name is Joe Huang, and I am the product manager for ADF Mobile. Oracle ADF Mobile is the part of Oracle's Application Development Framework (ADF) that supports the development of enterprise/business applications that run on mobile devices. The development tool for this framework is, of course, Oracle JDeveloper. As some of you may know, we currently support the development of mobile browser-based applications; this part of the product is called ADF Mobile Browser. Additionally, we are close to releasing a technology preview of ADF Mobile Client, which supports the development of on-device, disconnected-capable mobile applications. What's truly unique about the ADF Mobile development process is that it's a very visual and declarative experience, while still allowing power Java developers to completely extend the framework to their liking. The framework also provides a rich set of services needed by an enterprise-grade mobile application; these services would literally take years to implement if they were built from the ground up. However, by using JDeveloper and ADF Mobile, you get the entire framework at your service! In the coming entries, the ADF Mobile product development team will publish news, best practices, our observations on mobile technology trends, or just our experiences in playing with "gadgets". Be sure to check back on this page!

    Sincerely,
    Joe Huang
    Oracle

    Read the article

  • Site too large to officially use Google Analytics?

    - by Jeff Atwood
    We just got this email from the Google Analytics team:

    "We love that you love our product and use it as much as you do. We have observed, however, that a website you are tracking with Google Analytics is sending over 1 million hits per day to Google Analytics servers. This is well above the '5 million pageviews per month per account' limit specified in the Google Analytics Terms of Service. Processing this amount of data multiple times a day takes up valuable resources that enable us to continue to develop the product for all Google Analytics users. As such, starting August 23rd, 2010, the metrics in your reports will be updated once a day, as opposed to multiple times during the course of the day. You will continue to receive all the reports and features in Google Analytics as usual. The only change will be that data for a given day will appear the following day. We trust you understand the reasons for this change."

    I totally respect this decision, and I think it's very generous of them to not kick us out. But how do we do this the right way -- what's the official, blessed Google way to use Google Analytics if you're a "whale" website with lots of hits per day? Or are there other analytics services that would be more appropriate for very large websites?

    Read the article

  • Learn How to Use Oracle’s Spatial and BI Tools for Location-aware Predictive Analytics

    - by Mandy Ho
    November 29, 2-3pm EST. Are you an OBIEE (Oracle Business Intelligence Enterprise Edition) user? Do you have location data you'd like to incorporate into your analysis as well? Then this is a great webinar for you! Join us as Oracle experts from both teams show how to perform predictive analytics, network analytics and spatial analysis, combined together, in real-world scenarios. We will include demos evaluating airline on-time performance and retail establishment performance. Learn how to:

    - Gain better business insights and improve ROI with Oracle Spatial and Graph, Oracle Advanced Analytics, and Oracle Business Intelligence Enterprise Edition (OBIEE).
    - Streamline and remove the complexity of building applications with OBIEE's built-in location and analytics features.
    - Create the statistical model, build interactive reports and dashboards including location analysis and map visualization, and incorporate network analytics for geomarketing and site scoring.
    - Perform location analysis and processing such as proximity, containment, geocoding, aggregation of geographic regions, and more.

    Speakers include Jayant Sharma, Director, Product Management, Oracle Spatial and Mapping Technologies; Jean Ihm, Principal Product Manager, Oracle Spatial and Mapping Technologies; and Abhinav Agarwal, OBIEE Product Management.

    Who should attend: this webinar is appropriate for CIOs, business and technical managers, developers, and analysts involved in the design and management of analytic applications and solutions where spatial analysis can add insight and value to business processes. Click the link below to sign up today!

    https://www2.gotomeeting.com/register/764677554

    Read the article

  • The Red Gate and .NET Reflector Debacle

    - by Rick Strahl
    About a month ago Red Gate -- the company that owns the .NET Reflector tool most .NET devs use at one point or another -- decided to change their business model for Reflector and take the product from free to a fully paid-for license model. As a bit of history: .NET Reflector was originally created by Lutz Roeder as a free community tool to inspect .NET assemblies. Using Reflector you can examine the types in an assembly, drill into type signatures and quickly disassemble code to see how a particular method works. In case you've been living under a rock and you've never looked at Reflector, it lets you drill into an assembly from disk and view the disassembled source code. You get tons of information about each element in the tree, and almost all related types and members are clickable both in the list and source view, so it's extremely easy to navigate and follow the code flow, even in this static assembly-only view. For many years Lutz kept the tool up to date and gradually added more features, making an already amazing tool better. Then about two and a half years ago Red Gate bought the tool from Lutz. A lot of ruckus and noise ensued in the community back then about what would happen with the tool, and... for the most part very little did. Other than the incessant update notices with a prominent Red Gate promo on them, life with Reflector went on. The product didn't die, and it didn't go commercial or to a charge model. When .NET 4.0 came out it still continued to work, mostly because the .NET feature set doesn't drastically change how types behave.

    Then a month back Red Gate started making noise about a new Version 7, which would be commercial. No more free version -- and a shit storm broke out in the community. Now normally I'm not one to be critical of companies trying to make money from a product, much less for a product that's as incredibly useful as Reflector. There isn't a day in .NET development that goes by for me where I don't fire up Reflector, whether it's for examining the innards of the .NET Framework, checking out third party code, or verifying some of my own code and resources. Even more so recently, I've been doing a lot of interop work with a non-.NET application that needs to access .NET components, and Reflector has been immensely valuable to me (and my clients) in figuring out the exact type signatures required to call .NET components in assemblies. In short, Reflector is an invaluable tool to me.

    Ok, so what's the problem? Why all the fuss? Certainly the $39 Red Gate is trying to charge isn't going to kill any developer. If there's any tool in .NET that's worth $39, it's Reflector, right? Right, but that's not the problem here. The problem is how Red Gate went about moving the product to commercial, which borders on the downright bizarre. It's almost as if somebody in management wrote a slogan: "How can we piss off the .NET community in the most painful way we can?" And at that, it seems, Red Gate has utterly succeeded. People are rabid, and for once I think this outrage isn't exactly misplaced. Take a look at the message thread that Red Gate dedicated from a link off the download page. Not only is Version 7 going to be a paid commercial tool, but the older versions of Reflector won't be available any longer. Not only that, but older versions that are already in use will also continually try to update themselves to the new paid version, which, when installed, will then expire unless registered properly.
For many years Lutz kept the tool up to date and gradually added features, making an already amazing tool even better. Then about two and a half years ago Red Gate bought the tool from Lutz. A lot of ruckus and noise ensued in the community back then about what would happen with the tool and… for the most part very little did. Other than the incessant update notices with a prominent Red Gate promo on them, life with Reflector went on. The product didn't die and it didn't go commercial or to a charge model. When .NET 4.0 came out it still continued to work, mostly because the .NET feature set doesn't drastically change how types behave.

Then a month back Red Gate started making noise about a new version – Version 7 – which would be commercial. No more free version – and a shit storm broke out in the community.

Now normally I'm not one to be critical of companies trying to make money from a product, much less from a product as incredibly useful as Reflector. There isn't a day in .NET development that goes by for me where I don't fire up Reflector – whether it's examining the innards of the .NET Framework, checking out third party code, or verifying some of my own code and resources. Even more so recently I've been doing a lot of Interop work with a non-.NET application that needs to access .NET components, and Reflector has been immensely valuable to me (and my clients) in figuring out the exact type signatures required to call .NET components in assemblies. In short, Reflector is an invaluable tool to me.

Ok, so what's the problem? Why all the fuss? Certainly the $39 Red Gate is trying to charge isn't going to kill any developer. If there's any tool in .NET that's worth $39 it's Reflector, right? Right, but that's not the problem here. The problem is how Red Gate went about moving the product to commercial, which borders on the downright bizarre. It's almost as if somebody in management wrote a slogan: "How can we piss off the .NET community in the most painful way we can?" And at that, it seems, Red Gate has utterly succeeded.

People are rabid, and for once I think this outrage isn't misplaced. Take a look at the message thread Red Gate links from the download page. Not only is Version 7 going to be a paid commercial tool, but the older versions of Reflector won't be available any longer. Not only that, but older versions already in use will continually try to update themselves to the new paid version – which, once installed, expires unless registered properly. There have also been reports of Version 6 installs shutting themselves down and refusing to work if the update is declined (I haven't seen that myself, so I'm not sure it's true). In other words, Red Gate is trying to make damn sure they're getting your money if you attempt to use Reflector.

There's a lot of temptation there. Think about the millions of .NET developers out there and all of them possibly upgrading – that's a nice chunk of change Red Gate's sitting on. Even with all the community backlash, these guys are probably making some bank right now simply because people need to get on with life.

Red Gate also put up a Feedback link on the download page – which, not surprisingly, is chock full of hate mail condemning the move. Oddly there's not a single response to any of those messages from the Red Gate folks, except where license questions for the full version are concerned. It puzzles me what purpose that link serves, other than as yet another complete example of failing to understand how to handle customer relations.

There's no doubt that all of this has caused some serious outrage in the community. The sad part is that this could have been handled far less arrogantly, without pissing off the entire community and causing so much ill will. People are pissed off, and I have no doubt this negative publicity will show up in the sales numbers for Red Gate's other products. I certainly hope so. Stupidity ought to be painful!

Why do Companies do boneheaded stuff like this?

Red Gate's original decision to buy Reflector was hotly debated, but at the time most of what would happen was speculation. I thought it was a smart move for any company that needs to spread its marketing message and corporate image as a vendor in the .NET space. Where else do you get to flash your corporate logo to hordes of .NET developers on a regular basis? Pairing that marketing with the goodwill of providing a free tool breeds positive feedback that hopefully has a good effect on the company's visibility and the products it sells.

Instead, Red Gate seems to have taken exactly the opposite tack – corporate bullying to make a quick buck – and in the process ruined any community goodwill that might have come from providing a service to the community for free while still getting valuable marketing.

What's so puzzling about this boneheaded escapade is that the company doesn't need to resort to underhanded tactics like what they're trying with Reflector 7. The tools the company makes are very good. I personally use SQL Compare, SQL Data Compare and ANTS Profiler on a regular basis, and all of these tools are essential in my toolbox. They certainly work much better than the tools that are in the box with Visual Studio. Chances are that if Reflector 7 added useful features, I would have been more than happy to shell out my $39 to upgrade when the time was right.

It's Expensive to give away stuff for Free

At the same time, this episode shows some of the big problems that come with 'free' tools. A lot of organizations are realizing that giving stuff away for free is actually quite expensive, and the payback is often intangible, if there is any at all. Those that rely on donations or other voluntary compensation find that the amounts contributed are so minuscule as to not matter at all.
Yet at the same time, I bet most of those clamoring the loudest on that Red Gate feedback page about Reflector no longer being free have NEVER made a donation to any open source project or free tool. The expectation of Free these days is just too great – which is a shame, I think. There's a lot to be said for paid software and having somebody to hold responsible because you gave them money. There's an incentive -> payback -> responsibility model that seems to be missing from free software (not all of it, but a lot of it). While there are certainly plenty of bad apples in paid software as well, money tends to be a good motivator for people to keep working on and improving products.

The reasons for giving stuff away are many, but often it starts with a naïve desire to share while things are simple. At first it might be no problem to volunteer time and effort, but as products mature the fun goes out of it, and as the reality of product maintenance kicks in, developers want something back for the time and effort they're putting into non-glamorous work. That's when products die or languish, and it's painful for everyone to watch.

For Red Gate, however, I think there was always a pretty good payback from the Reflector acquisition in terms of marketing: visibility and positioning of their products, although they seem to have mostly ignored that option. On the other hand, they started this off pretty badly even two and a half years back, when they acquired Reflector from Lutz with the same arrogant attitude that is evident in this latest episode. You really gotta wonder what the folks in management are thinking – the sad part is that, from advance emails that were circulating, they were fully aware of the shit storm they were inciting, and I suspect they're banking on the sheer numbers of .NET developers to still make them a tidy chunk of change from upgrades…

Alternatives are coming

For me personally the single license isn't a problem, but I actually sell a tool of my own (an Interop Web Service proxy generation tool), and one of the things I've recommended using alongside it has been Reflector, to view assembly information and to find which Interop classes to instantiate from the non-.NET environment. It's been nice to use Reflector for this, with its small footprint and zero-configuration installation. But with V7 becoming a paid tool, that option is no longer going to be available.

Luckily it looks like the .NET community is jumping in to fill the void. Amidst the Red Gate outrage, a new open source tool called ILSpy has sprung up, providing at least some of Reflector's core functionality. It looks promising going forward, and I suspect there will be a lot more support and interest in the project now that Reflector has gone over to the 'dark side'…

© Rick Strahl, West Wind Technologies, 2005-2011

    Read the article
