Search Results

Search found 1266 results on 51 pages for 'culture'.

Page 39 of 51

  • Growing Talent

    The subtitle of Daniel Coyle's intriguing book The Talent Code is Greatness Isn't Born. It's Grown. Here's How. The Talent Code proceeds to lay out a theory of how expertise can be cultivated through specific practices that encourage the growth of myelin in the brain. Myelin is a material that is produced and wraps around heavily used circuits in the brain, making them more efficient. Coyle uses an analogy that geeks will appreciate. When a circuit in the brain is used a lot (i.e. a specific action is repeated), the myelin insulates that circuit, increasing its bandwidth from telephone-over-copper to high-speed broadband. This leads to the funny phenomenon of effortless expertise. Although highly skilled, the best players make it look easy. Coyle provides some biological backing for the long-held theory that it takes 10,000 hours of practice to achieve mastery over a given subject. 10,000 hours or 10 years, as in Teach Yourself Programming in Ten Years and others. However, it is not just that more hours equals more mastery. The other factor that Coyle identifies is deep practice: practice which crucially involves drills that are challenging without being impossible. Another way to put it is that every day you spend doing only tasks you find monotonous and automatic, you are literally stunting your brain's development! Perhaps Coyle's subtitle needs one more phrase: Greatness Isn't Born. It's Grown. Here's How. And oh yeah, it's not easy. Challenging yourself, continuing to persist in the face of repeated failures, practicing every day is not easy. As consultants, we sell our expertise, so it makes sense that we plan projects so that people can play to their strengths. At the same time, an important part of our culture is constant improvement, challenging yourself to be better. And so the balancing act ensues. I just finished working on a proof of concept (POC) we did for a project we are bidding on. Completely time-boxed, so our team naturally split responsibilities amongst ourselves according to who was better at what. I must have been pretty bad at the other components, as I found myself working on the user interface, not my usual strength. The POC had a website frontend, and one thing I do know is HTML. After starting out in pure ASP.NET WebForms, I got frustrated: time was ticking, I knew what I wanted in HTML, but I couldn't coax the right output out of the ASP.NET controls. I needed two or three elements on the screen that were identical in layout, with different content. With a backup plan in place of writing the HTML into the response by hand, I decided to challenge myself a bit and see what I could do in an hour or two using the Microsoft-submitted jQuery micro-templating JavaScript library. This risk paid off. I was able to quickly get the user interface up and running, responsive to the JSON data we were working with. I felt energized by the double win of getting the POC ready and learning something new. Opportunities like this POC don't come around often, but the takeaway is that while it won't be easy, there are ways to generate your own opportunities to grow towards greatness.

    Read the article

  • Attending a Career Fair: “Don’t be shy – Be prepared”

    - by jessica.ebbelaar
    There are a large number of ways to interact with companies nowadays. The career fair is a very effective and personal way to interact with a number of different companies in a very short period of time. Here are some simple tips to help you perform during a career fair. Do research: The key to being successful at a career fair is to do research before you go. Make a first selection of the companies you feel could be interesting for you. Include many types of employers. Once you have decided on the list of companies you want to visit, go to their career portal. Inform yourself about what the company does, e.g. what roles are available, how the company culture is described, and what impression the testimonials give you. The questions that you still have after reviewing this information are the ones you can discuss with the company at the fair. Sell yourself: Visit the companies on your top-5 list first, so you will be at your highest energy level to make that first impression. Think in advance about what you are going to tell the recruiter. Prepare a 30-second introduction (including degree, strengths, skills & experience). Be confident when you talk about your experience. Remember to start the conversation with a smile, make good eye contact and give a firm handshake. You could be speaking to your next manager, so be professional! If you already know what jobs you are interested in, relate your skills and experience to the roles that the company has available. If you are not yet sure, gather as much information as you can about employment and/or hiring procedures, specific skills necessary for different jobs, training and career paths. Stand out: As career fairs are very crowded and the attending companies meet with a lot of potential candidates in one day, you have to make sure you are noticed in a positive way. Good preparation and asking questions that show you have a good understanding of the industry, organization and roles will help you. Be aware of time demands on employers. Do not monopolize an employer's time. Dress appropriately to make a good first impression. Bring your resume: Do not forget to bring your resume, in print or on a USB stick, to the fair. If you are searching for different types of jobs, bring different versions of your resume. Your resume should be short and professional, on white paper that is free of graphics or fancy print styles, with larger margins for interviewer notes. Follow up: After each conversation, ask who you can contact for follow-up discussions about the specific roles. Use the back of a business card to record notes that help you remember important details and follow-up instructions. If no card is available, record the contact information and your comments in your notepad or phone. Last but not least, thank everyone you talk to for their time. Follow up as soon as possible with thank-you notes that address the companies’ hiring needs and your qualifications, and express your desire for a second interview. What not to do: Do not visit a company with a group of friends. Interact with the companies on your own, to make your own positive impression. Do not walk up to a recruiter and interrupt a current conversation; wait your turn and be polite. What you should absolutely avoid is a grab-and-run on freebies! Take the time to speak to the company and ask for a freebie at the end of the conversation in case one is not offered to you. Good luck with the preparations for the career fair you will attend. Oracle recruiters look forward to meeting you! They will be present at a large number of fairs in the region. For an overview of the fairs, go to the Events & Calendar page on http://campus.oracle.com If you have any questions related to this article, feel free to contact [email protected].

    Read the article

  • Where Twitter Stands Heading Into 2013

    - by Mike Stiles
    As Twitter continued throughout 2012 to be a stage on which global politics and culture played itself out, the company itself underwent some adjustments that give us a good indication of what users and brands can expect from the platform in 2013. The power of the network did anything but fade. Celebrities continued to use it to connect one-on-one. Even the Pope signed on this year. It continued to fuel revolutions. It played an exponentially larger factor in this US Presidential election. And around the world, the freedom to speak was challenged as users were fired, sued, sometimes even jailed for their tweets. Expect more of the same in 2013, as Twitter has entrenched itself, for individuals, causes and brands, as the fastest, easiest, most efficient way to message the masses so some measure of impact can come from it. It’s changed everything, and it’s not finished. These fun facts reveal the position of strength with which Twitter enters 2013: it now generates a billion tweets every 2.5 days; it has 500 million+ users; the average Twitter user has tweeted 307 times; 32% of everyone using the Internet uses Twitter; it’s expected to bring in $540 million in ad revenue by 2014; and 11 new accounts are created every second. High-level executive summary: people love it, people use it, and they’re going to keep loving and using it. Whether or not outside developers love it is a different matter. 2012 marked a shift from welcoming the third-party support that played at least some role in Twitter being so warmly embraced, to discouraging anything that replicates what Twitter can do itself…or plans to do itself. It’s not the open playground it once was. Now Twitter must spend 2013 proving it can innovate in-house and keep us just as entranced. Likewise, Twitter is distancing itself from Facebook. Images from the #1 platform’s Instagram don’t work on Twitter anymore, and Twitter is rolling out its own photo filter product. Where the two have lived in a “plenty of room for everybody” symbiosis up to now, 2013 could see the giants ramping up a full-on rivalry. Twitter is exhibiting a deliberate strategy. Updates have centered on more visually appealing search results, and on making finding and sharing content easier. Deals have been cut with some media entities so their content stands out. CEO Dick Costolo has said tweets aren’t the attraction; they’re what leads you to content. Twitter aims to be a key distributor of media and info. Add the addition of former News Corp. President Peter Chernin to the board, and their hashtag landing-page experience for events, and their media-behemoth ambitions get pretty clear. There are challenges ahead, and Costolo has also laid those out: entry into China, figuring out how to have Twitter deliver both comprehensive and relevant, targeted experiences, and the visualization of big data. What does this mean for corporations? They can expect a more media-rich evolution and a growing emphasis on imagery. They can expect more opportunities to create great media content and leverage Twitter for its distribution. And they can expect new ways to surface in searches. Are brands diving in? 56% of customer tweets to companies get completely and totally ignored. Ugh. A study Twitter recently conducted with Compete shows people who see tweets from retailers are more likely to buy a product. And the more retailer tweets they see, the more likely they are to purchase on the retail site. As more of those tweets point to engaging media content from the brand, the results should get even better. Twitter appears ready for 2013. Enterprise brands have some work to do. @mikestiles Photo: Stuart Miles, freedigitalphotos.net

    Read the article

  • Head in the Clouds

    - by Tony Davis
    We're just past the second anniversary of the launch of Windows Azure. A couple of years' experience with Azure in the industry has provided some obvious success stories, but has deflated some of the initial marketing hyperbole. As a general principle, Azure seems to work well in providing a Service-Oriented Architecture for services in enterprises that suffer wide fluctuations in demand. Instead of being obliged to provide hardware sufficient for the occasional peaks in demand, one can hire capacity only when it is needed, and the cost of hosting an application is no longer a capital cost. It enables companies to avoid having to scale out hardware for peak periods only to see it underused for the rest of the time. A customer-facing application such as a concert ticketing system, which suffers high demand in short, predictable bursts of activity, is a great example of an application that would work well in Azure. However, moving existing applications to Azure isn't something to be done on impulse. Unless your application is .NET-based, and consists of 'stateless' components that communicate via queues, you are probably in for a lot of redevelopment work. It makes most sense for IT departments who are already deep in this .NET mindset, and who also want 'grown-up' methods of staging, testing, and deployment. Azure fits well with this culture and offers, as a bonus, good Visual Studio integration. The most commonly stated barrier to porting these applications to Azure is the problem of reconciling the use of the cloud with legislation for data privacy and security. Putting databases in the cloud is a sticky issue for many and impossible for some, due to compliance and security issues, the need for direct control over data, and so on. In the face of feedback from the early adopters of Azure, Microsoft has broadened the architectural choices to cater for a wide range of requirements. Windows Azure now offers a wide range of storage options: as well as SQL Azure Database (SAD), there is Azure storage, the unstructured 'BLOB and Entity-Attribute-Value' NoSQL alternative (which equates more closely with folders and files than a database), and services such as OData. Developers who are programming for Windows Azure can simply choose the option most appropriate for their needs. Secondly, and crucially, the Windows Azure architecture allows you the freedom to produce hybrid applications, where only those parts that need cloud-based hosting are deployed to Azure, whereas those parts that must unavoidably be hosted in a corporate datacenter can stay there. By using a hybrid architecture, it will seldom, if ever, be necessary to move an entire application to the cloud, along with personal and financial data. For example, we could port to Azure only those parts of our ticketing application that capture and process ticket orders; once an order is captured, the financial side can be processed in our own data center. In short, Windows Azure seems to be a very effective way of providing services that are subject to wide but predictable fluctuations in demand. Have you come to the same conclusions, or do you think I've got it wrong? If you've had experience with Azure, would you recommend it? It would be great to hear from you. Cheers, Tony.

    Read the article

  • Brazil is Hot for Social Media

    - by Mike Stiles
    Today’s guest blog is from Oracle SVP Product Development Reggie Bradford, fresh off a visit to Sao Paulo, Brazil, where he spoke at the Dachis Social Business Summit and spent some time getting a personal taste for the astonishing growth of social in Brazil, both in terms of usage and engagement. I knew it was big, but I now have an all-new appreciation for why the Wall Street Journal branded Brazil the “social media capital of the universe.” Brazil has the world’s 5th largest economy, an expanding middle class, an active younger demo market, a connected & outgoing culture, and an ongoing embrace of the social media platforms. According to comScore's 2012 Brazil Digital Future in Focus report, 97% of its online population is using social media, and that’s not even taking mobile-only users into account. There were 65 million Facebook users in 2012, spending an average of 535 minutes there, up 208%. It’s one of Twitter’s fastest growing markets and the 2nd biggest market for YouTube. Instagram usage has grown over 300% since last year. That by itself is exciting, but look at the opportunity for social marketing brands. 74% of Brazilian social users follow brands on Facebook, and 59% have praised a company on either Twitter or Facebook. A 2011 Oh! Panel study found 81% of social networkers there used social to research new products and 75% went there looking for discounts. B2C eCommerce sales in Brazil are projected to hit $26.9 billion by 2015. I bet I’m not the only one who sees great things ahead, and I was fortunate enough to give a keynote at ABRADI, an association of leading digital agencies in Brazil, with 53 execs from 35 agencies attending. I was also afforded the opportunity to give my impressions of what’s going on in Brazil to Jornal Propaganda & Marketing, one of the most popular publications in Latin America for marketers. I conveyed that, especially in an environment like Brazil, where social users are so willing to connect and engage with brands, marketers need to back away from the heavy-handed, one-way messaging of old-school advertising and move toward genuine relationships and trust-building. To aid in this, organizational and operational changes must be embraced inside the enterprise. We've talked often about the new, tighter partnership forming between the CIO and CMO. If this partnership is not encouraged, fostered and resourced, the increasing amount of time consumers spend on mobile and digital, and the efficiencies and integrations offered by cloud-based software, cannot be exploited. These are the kinds of changes that can yield social data that, when combined with enterprise data, helps you come to know your social audiences intimately and predict their needs. Consumers are always connected and need your brand to be accessible at any time, be it for information or customer service. And, of course, all of this is happening quite publicly. The holistic, socially-enabled enterprise connects social to customer service systems and all other customer touch points, facilitating the kind of immediate, real-time, gratifying response customers are coming to expect. Social users in Brazil are highly active and clearly willing to meet us as brands more than halfway. Empowering yourself with a social management technology platform will have you set up to maximize this booming social market…from listening & monitoring to engagement to analytics to workflow & automation to globalization & language support. Brands, it’s time to be as social as the great people of Brazil are. Obrigado! @reggiebradford Photo: Gualberto107, freedigitalphotos.net

    Read the article

  • When things go awry

    - by Phil Factor
    The moment the Entrepreneur opened his mouth on prime-time national TV, spelled out the URL and waxed big on how exciting ‘his’ new website was, I knew I was in for a busy night. I’d designed and built it. All at once, half a million people tried to log into the website. Although all my stress-testing paid off, I have to admit that the network locked up tight long before there was any danger of a database or website problem. Soon afterwards, the Entrepreneur and the Big Boss were there in the autopsy meeting. We picked through all our systems in detail to see how they’d borne the unexpected strain. Mercifully, in view of the sour mood of the Big Boss, it turned out that the only thing we could have done better was buy a bigger pipe to and from the internet. We’d specified that ‘big pipe’ when designing the system. The Big Boss had then railed at the cost and so we’d subsequently compromised. I felt that my design decisions were vindicated. The Big Boss brooded for a while. Then he made the significant comment: “What really ****** me off is the fact that, for ten minutes, we couldn’t take people’s money.” At that point I stopped feeling smug. Had the internet connection been better, the system would have reached its limit and failed rather precipitously, and that wasn’t what he wanted. Then it occurred to me that what had gummed up the connection was all those images on the site, that had made it so impressive for the visitors. If there had been a way to automatically pare down the site to the bare essentials under stress… Hmm. I began to consider disaster recovery in the broadest sense – maintaining a service in spite of unusual or unexpected events. What he said makes a lot of sense: sacrifice whatever isn’t essential to keep the core service running when we approach the capacity limits. Maybe in IT we should borrow (or revive) the business concept of the ‘Skeleton service’, maintaining only the priority parts under stress, using a process that is well-prepared and carefully rehearsed. How might this work? Whatever the event we have to prepare for, it is all about understanding the priorities; knowing what one can dispense with when the going gets tough. In the event of database disaster, it’s much faster to deploy a skeletal system with only the essential data than to restore the entire system, though there would have to be a reconciliation process to update the revived database retrospectively, once the emergency was over. It isn’t just the database that could be designed for resilience. One could prepare for unusually high traffic in a website by designing a system that degraded gradually to a ‘skeletal’ site, one that maintained the commercial essentials without fat images, JavaScript libraries and razzmatazz. This is all what the Big Boss scathingly called ‘a mere technicality’. It seems to me that what is needed first is a culture of application and database design which acknowledges that we live in a very imperfect world, and reacts accordingly when things go awry.

    Read the article

  • In the Firing Line: The impact of project and portfolio performance on the CEO

    - by Melissa Centurio Lopes
    What are the primary measurements for rating CEO performance? For corporate boards, business analysts, investors, and the trade press, the metrics they deploy are relatively binary in nature: what is being done to generate earnings, and what is being done to build and sustain high performance? As for the market, interest is primarily aroused when operational and financial performance falls outside planned commitments for the year. When organizations announce better-than-predicted results, they usually experience an immediate increase in share price. Likewise, poor results have an obviously negative impact on the share price and impact the role and tenure of the incumbent CEO. The danger for the CEO is that the risk of failure is ever present, ranging from manufacturing delays and supply chain issues to labor shortages and scope creep. This risk is enhanced by the involvement of secondary suppliers providing services critical to overall work schedules, and magnified further across a portfolio of programs and projects underway at any one time – and all set within a global context. All can impact planned return on investment and have an inevitable impact on the share price – the primary empirical measure of day-to-day performance. Read the complete complimentary report, In the Firing Line, and explore the direct link between the health of the portfolio and CEO performance. This report will provide an overview of the responsibility the CEO has for implementing and maintaining a culture of accountability, offer examples of some of the higher-profile project failings in recent years, and detail the capabilities available to the CEO to mitigate the risks residing in their own portfolios.

    Read the article

  • No Customer Left Behind

    - by Kathryn Perry
    A guest post by David Vap, Group Vice President, Oracle Applications Product Development What does customer experience mean to you? Is it a strategy for your executives? A new buzzword and marketing term? A bunch of CRM technology with social software added on? For me, customer experience is a customer-centric worldview that produces a deeper understanding of your business and what it takes to achieve sustainable, differentiated success. It requires you to prioritize and examine the journey your customers are on with your brand, so you can answer the question, "How can we drive greater value for our business by delivering a better customer experience?" Businesses that embrace a customer-centric worldview understand their business at a much deeper level than most. They know who their customers are, what their value is, what they do, what they say, what they want, and ultimately what that means to their business. "Why Isn't Everyone Doing It?" We're all consumers who have our own experiences with many brands. Good or bad, some of those experiences stay with us. So viscerally we understand the concept of customer experience from the stories we share. One that stands out in my mind happened as I was preparing to leave for a 12-month job assignment in Europe. I wanted to put my cable television subscription on hold. I wasn't leaving for another vendor. I wasn't upset. I just had a situation where it made sense to put my $180 per month account on pause until I returned. Unfortunately, there was no way for this cable company to acknowledge that I was a loyal customer with a logical request - and to respond accordingly. So, ultimately, they lost my business. Research shows us that it costs six to seven times more to acquire a new customer than to retain an existing one. Heavily funding the efforts of getting new customers and underfunding the efforts of serving the needs of your existing customers (who are your greatest advocates) is a vicious and costly cycle. "Hey, These Guys Suck!" I love my Apple iPad because it's so easy to use. The explosion of these types of technologies, combined with new media channels, has raised our expectations and made us hyperaware of what's going on and what's available. In addition, social media has given us a megaphone to share experiences both positive and negative with greater impact. We are now an always-on culture that thrives on our ability to access, connect, and share anywhere anytime. If we don't get the service, product, or value we expect, it is easy to tell many people about it. We also can quickly learn where else to get what we want. Consumers have the power of influence and choice at a global scale. The businesses that understand this principle are able to leverage that power to their advantage. The ones that don't, suffer from it. Which camp are you in? Note: This is Part 1 in a three-part series. Stop back for Part 2 on November 19.

    Read the article

  • Using ReportViewer 9 control in VS 2010

    - by Fermin
    Hi, I am writing an ASP.NET app that uses SQL Server 2005 with an SSRS setup. I want to use the ReportViewer control, but I get an error when using ReportViewer 10 because it needs SSRS 2008. How can I use ReportViewer 9 within my application? I've added a reference to Microsoft.ReportViewer.WebForms.dll version 9 and removed the reference to version 10. My markup is as follows: <%@ Register Assembly="Microsoft.ReportViewer.WebForms, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" Namespace="Microsoft.Reporting.WebForms" TagPrefix="rsweb" %> <!-- standard markup --> <rsweb:ReportViewer ID="ReportViewer1" runat="server"></rsweb:ReportViewer> but when I try to run this I get the following error: CS0433: The type 'Microsoft.Reporting.WebForms.ReportViewer' exists in both 'c:\WINDOWS\assembly\GAC_MSIL\Microsoft.ReportViewer.WebForms\10.0.0.0__b03f5f7f11d50a3a\Microsoft.ReportViewer.WebForms.dll' and 'c:\WINDOWS\assembly\GAC_MSIL\Microsoft.ReportViewer.WebForms\9.0.0.0__b03f5f7f11d50a3a\Microsoft.ReportViewer.WebForms.dll' What have I missed!? Update: When trying to use ReportViewer 10 I get the following error: "Remote report processing requires Microsoft SQL Server 2008 Reporting Services or later."
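    A plausible culprit for CS0433 is that the site's web.config still registers the version 10 assembly under <compilation>, so the compiler sees both versions. A minimal sketch of the fix, assuming the default Web Application web.config layout:

        <!-- sketch: keep only the 9.0 registration; adjust to match your existing <assemblies> list -->
        <system.web>
          <compilation debug="true">
            <assemblies>
              <remove assembly="Microsoft.ReportViewer.WebForms, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
              <add assembly="Microsoft.ReportViewer.WebForms, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a" />
            </assemblies>
          </compilation>
        </system.web>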

    Read the article

  • Could not load file or assembly FSharp.Core, Version=4.0.0.0

    - by Ken
    I'm trying to deploy a web application which uses F# 4.0 on Windows Server 2008. It works on my computer, where VS2010 is installed, but it doesn't work on the server. Every time you open the page you'll get this error message: Could not load file or assembly 'FSharp.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified. I've installed .NET 4 using the Web Platform Installer. The F# PowerPack is installed too. I found this page: http://connect.microsoft.com/VisualStudio/feedback/details/507202/error-in-working-with-f It suggests you reinstall F#, but the link to download F# seems to be broken. And it might not be the same problem I have. I've also tried to install Microsoft F# 2.0.0.0, since it's the only F# redistributable I could find. But it doesn't help at all. Has anyone gotten something like this to work? Any help would be appreciated. Thanks.
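    One hedged guess: the server has no F# runtime installed, so FSharp.Core.dll never reaches the application's bin folder. Marking the reference Copy Local makes the DLL deploy with the site; a sketch against a generic project file (element placement may differ in yours):

        <!-- sketch: force FSharp.Core to copy to bin so the server doesn't need F# installed -->
        <Reference Include="FSharp.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
          <Private>True</Private>
        </Reference>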

    Read the article

  • ASP.NET MVC tries to load older version of Owin assembly

    - by d_mcg
    As a bit of context, I'm developing an ASP.NET MVC 5 application that uses OAuth-based authentication via Microsoft's OWIN implementation, for Facebook and Google only at this stage. Currently (as of v3.0.0, git-commit 4932c2f), the FacebookAuthenticationOptions and GoogleOAuth2AuthenticationOptions don't provide any property to force Facebook nor Google respectively to reauthenticate users (via appending the appropriate query string parameters) when signing in. Initially, I set out to override the following classes: FacebookAuthenticationOptions GoogleOAuth2AuthenticationOptions FacebookAuthenticationHandler (specifically AuthenticateCoreAsync()) GoogleOAuth2AuthenticationHandler (specifically AuthenticateCoreAsync()) yet discovered that the ~AuthenticationHandler classes are marked as internal. So I pulled a copy of the source for the Katana project (http://katanaproject.codeplex.com/) and modified the source accordingly. After compiling, I found that there are several dependencies that needed updating in order to use these updated assemblies (Microsoft.Owin.Security.Facebook and Microsoft.Owin.Security.Google) in the MVC project: Microsoft.Owin Microsoft.Owin.Security Microsoft.Owin.Security.Cookies Microsoft.Owin.Security.OAuth Microsoft.Owin.Host.SystemWeb This was done by replacing the existing project references to the 3.0.0 versions and updating those in web.config. Good news: the project compiles successfully. In debugging, I received an exception on startup: An exception of type 'System.IO.FileLoadException' occurred in [MVC web assembly].dll but was not handled in user code Additional information: Could not load file or assembly 'Microsoft.Owin.Security, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040) The underlying exception indicated that Microsoft.AspNet.Identity.Owin was trying to load v2.1.0 of Microsoft.Owin.Security when calling app.UseExternalSignInCookie() from Startup.ConfigureAuth(IAppBuilder app) in Startup.Auth.cs. Unfortunately that assembly (and its other dependency, Microsoft.AspNet.Identity.Owin) aren't part of the Project Katana solution, and I can't find any accessible repository for these assemblies online. Are the Microsoft.AspNet.Identity assemblies open source, like the Katana project? Is there a way to fool those assemblies to use the referenced v3.0.0 assemblies instead of v2.1.0? The /bin folder contains the 3.0.0 versions of the Owin assemblies. I've upgraded the NuGet packages for Microsoft.AspNet.Identity.Owin, and this is still an issue. Any ideas on how to resolve this issue?
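    A sketch of the usual workaround, not a confirmed fix: add a binding redirect so the 2.1.0 request from Microsoft.AspNet.Identity.Owin resolves to the 3.0.0 assemblies already in /bin. The version range is an assumption, and it only helps if the 3.0.0 surface is compatible with what Identity.Owin was compiled against:

        <!-- web.config sketch: redirect old Microsoft.Owin.Security requests to 3.0.0 -->
        <runtime>
          <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
            <dependentAssembly>
              <assemblyIdentity name="Microsoft.Owin.Security" publicKeyToken="31bf3856ad364e35" culture="neutral" />
              <bindingRedirect oldVersion="0.0.0.0-3.0.0.0" newVersion="3.0.0.0" />
            </dependentAssembly>
          </assemblyBinding>
        </runtime>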

    Read the article

  • Multiple Exception Handlers for one exception type

    - by danish
    I am using Enterprise Library 4.1. I have created a custom exception handler called CustomHandler. This is what the configuration section looks like: <exceptionHandling> <exceptionPolicies> <add name="Exception Policy"> <exceptionTypes> <add type="System.Exception, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" postHandlingAction="NotifyRethrow" name="Exception"> <exceptionHandlers> <add type="WindowsFormsApplication1.CustomHandler, WindowsFormsApplication1" name="Custom Handler" /> <add exceptionMessage="Some test mesage." exceptionMessageResourceName="" exceptionMessageResourceType="" replaceExceptionType="Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.ExceptionHandlingException, Microsoft.Practices.EnterpriseLibrary.ExceptionHandling" type="Microsoft.Practices.EnterpriseLibrary.ExceptionHandling.ReplaceHandler, Microsoft.Practices.EnterpriseLibrary.ExceptionHandling" name="Replace Handler" /> </exceptionHandlers> </add> </exceptionTypes> </add> </exceptionPolicies> </exceptionHandling> There are two handlers for the same exception type. What I want is that, based on a certain condition, one of the handlers should handle the exception. Any ideas how that can be done? Is there a way to call the other handler from inside the HandleException method of the custom handler, based on some condition?
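    Since Enterprise Library runs every handler in the chain in order, one common pattern is to make each handler test its own condition and pass the exception through untouched otherwise. A sketch (the Data["UseCustom"] test is purely hypothetical, and the configuration plumbing a real custom handler needs is omitted):

        using System;
        using Microsoft.Practices.EnterpriseLibrary.ExceptionHandling;

        public class CustomHandler : IExceptionHandler
        {
            public Exception HandleException(Exception exception, Guid handlingInstanceId)
            {
                // Hypothetical condition: only act when the caller tagged the exception.
                if (!exception.Data.Contains("UseCustom"))
                {
                    return exception;   // no-op; the Replace Handler still runs next in the chain
                }

                // ...condition met: do the custom handling here...
                return exception;
            }
        }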

    Read the article

  • StructureMap Exception Code: 202 No Default Instance defined for PluginFamily

    - by Code Sherpa
    Hi. I am new to StructureMap. I have downloaded and am using version 2.6.1.0. I keep getting the below error: StructureMap Exception Code: 202 No Default Instance defined for PluginFamily Company.ProjectCore.Core.IConfiguration, Company.ProjectCore, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null My Global.asax.cs looks like: protected void Application_Start(object sender, EventArgs e) { var container = new Container(x => { x.For<ICache>().Use<Cache>(); x.For<IEmailService>().Use<EmailService>(); x.For<IUserSession>().Use<UserSession>(); x.For<IRedirector>().Use<Redirector>(); x.For<INavigation>().Use<Navigation>(); }); container.AssertConfigurationIsValid(); } I changed from ObjectFactory.Initialize to "new Container" to debug. When stepping through the AssertConfigurationIsValid() method, Cache works but EmailService fails at the GetInstance method in the following line: private readonly IConfiguration _configuration; public EmailService() { _configuration = ObjectFactory.GetInstance<IConfiguration>(); } If I remove IEmailService, the same 202 error is thrown at IUserSession. Should I be adding something else in Application_Start or in my class files? Thanks in advance...
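    A sketch of one way out, assuming the root cause is twofold: no plugin family was ever registered for IConfiguration, and EmailService resolves it through the static ObjectFactory, which the new Container built in Application_Start never configures. AppConfiguration is a hypothetical concrete type:

        // Register the missing family in Application_Start...
        var container = new Container(x =>
        {
            x.For<IConfiguration>().Use<AppConfiguration>();
            x.For<IEmailService>().Use<EmailService>();
            // ...remaining registrations...
        });

        // ...and take the dependency via the constructor instead of ObjectFactory.
        public class EmailService : IEmailService
        {
            private readonly IConfiguration _configuration;

            public EmailService(IConfiguration configuration)
            {
                _configuration = configuration;
            }
        }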

    Read the article

  • bind a WPF datagrid to a datatable

    - by Jim Thomas
    I have used the marvelous example posted at http://www.codeproject.com/KB/WPF/WPFDataGridExamples.aspx to bind a WPF datagrid to a datatable. The source code below compiles fine; it even runs and displays the contents of the InfoWork datatable in the WPF datagrid. Hooray! But the WPF page with the datagrid will not display in the designer. Instead, I get an incomprehensible error on my design page, which is shown at the end of this posting. I assume the designer is having some difficulty instantiating the dataview for display in the grid. How can I fix that? XAML Code (the xmlns:local declaration sits on the Window element): xmlns:local="clr-namespace:InfoSeeker" <Window.Resources> <ObjectDataProvider x:Key="InfoWorkData" ObjectType="{x:Type local:InfoWorkData}" /> <ObjectDataProvider x:Key="InfoWork" ObjectInstance="{StaticResource InfoWorkData}" MethodName="GetInfoWork" /> </Window.Resources> <my:DataGrid DataContext="{Binding Source={StaticResource InfoWork}}" AutoGenerateColumns="True" ItemsSource="{Binding}" Name="dataGrid1" xmlns:my="http://schemas.microsoft.com/wpf/2008/toolkit" /> C# Code: namespace InfoSeeker { public class InfoWorkData { private InfoTableAdapters.InfoWorkTableAdapter infoAdapter; private Info infoDS; public InfoWorkData() { infoDS = new Info(); infoAdapter = new InfoTableAdapters.InfoWorkTableAdapter(); infoAdapter.Fill(infoDS.InfoWork); } public DataView GetInfoWork() { return infoDS.InfoWork.DefaultView; } } } Error shown in place of the designer page which has the grid on it: An unhandled exception has occurred: Type 'MS.Internal.Permissions.UserInitiatedNavigationPermission' in Assembly 'PresentationFramework, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' is not marked as serializable. at System.Runtime.Serialization.FormatterServices.InternalGetSerializableMembers(RuntimeType type) at System.Runtime.Serialization.FormatterServices.GetSerializableMembers(Type type, StreamingContext context) at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.InitMemberInfo() at System.Runtime.Serialization.Formatters.Binary.WriteObjectInfo.InitSerialize(Object obj, ISurrogateSelector surrogateSelector, StreamingContext context, SerObjectInfoInit serObjectInfoInit, IFormatterConverter converter, ObjectWriter objectWriter) ...At:Ms.Internal.Designer.DesignerPane.LoadDesignerView()
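    One hedged guess at the cause: the designer instantiates InfoWorkData through the ObjectDataProvider, and the TableAdapter fill fails inside the design surface. A common guard is to skip the database work at design time; a sketch of the constructor (requires references to System.ComponentModel and WindowsBase):

        public InfoWorkData()
        {
            // Skip the fill when the constructor runs inside the XAML designer.
            if (System.ComponentModel.DesignerProperties.GetIsInDesignMode(
                    new System.Windows.DependencyObject()))
                return;

            infoDS = new Info();
            infoAdapter = new InfoTableAdapters.InfoWorkTableAdapter();
            infoAdapter.Fill(infoDS.InfoWork);
        }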

    Read the article

  • Webservices error for dev.virtualearth.net/webservices/geocode

    - by Xaisoft
    I am getting the following stack trace and have no idea what I am looking at or how to debug and fix it. Here is the error: Description: An error occurred during the parsing of a resource required to service this request. Please review the following specific parse error details and modify your source file appropriately. Parser Error Message: Reference.svcmap: Failed to generate code for the service reference 'GeocodeService'. Cannot import wsdl:portType Detail: An exception was thrown while running a WSDL import extension: System.ServiceModel.Description.DataContractSerializerMessageContractImporter Error: Could not load file or assembly 'System.Xml, Version=2.0.5.0, Culture=neutral, PublicKeyToken=7cec85d7bea7798e' or one of its dependencies. The system cannot find the file specified. XPath to Error Source: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode/contracts']/wsdl:portType[@name='IGeocodeService'] Cannot import wsdl:binding Detail: There was an error importing a wsdl:portType that the wsdl:binding is dependent on. XPath to wsdl:portType: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode/contracts']/wsdl:portType[@name='IGeocodeService'] XPath to Error Source: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode']/wsdl:binding[@name='BasicHttpBinding_IGeocodeService'] Cannot import wsdl:port Detail: There was an error importing a wsdl:binding that the wsdl:port is dependent on. XPath to wsdl:binding: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode']/wsdl:binding[@name='BasicHttpBinding_IGeocodeService'] XPath to Error Source: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode']/wsdl:service[@name='GeocodeService']/wsdl:port[@name='BasicHttpBinding_IGeocodeService'] Cannot import wsdl:binding Detail: There was an error importing a wsdl:portType that the wsdl:binding is dependent on. XPath to wsdl:portType: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode/contracts']/wsdl:portType[@name='IGeocodeService'] XPath to Error Source: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode']/wsdl:binding[@name='CustomBinding_IGeocodeService'] Cannot import wsdl:port Detail: There was an error importing a wsdl:binding that the wsdl:port is dependent on. XPath to wsdl:binding: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode']/wsdl:binding[@name='CustomBinding_IGeocodeService'] XPath to Error Source: //wsdl:definitions[@targetNamespace='http://dev.virtualearth.net/webservices/v1/geocode']/wsdl:service[@name='GeocodeService']/wsdl:port[@name='CustomBinding_IGeocodeService']
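    For what it's worth, Version=2.0.5.0 is the version number Silverlight assemblies carry, which suggests the WSDL importer is trying to reuse types from a Silverlight System.Xml reference. A hedged sketch of one workaround: regenerate the proxy without type reuse, either in VS via "Configure Service Reference" with "Reuse types in referenced assemblies" unticked, or from a command prompt (the service URL below is inferred from the namespaces in the error and may differ):

        rem sketch: svcutil performs no type reuse unless /reference switches are passed
        svcutil.exe http://dev.virtualearth.net/webservices/v1/geocodeservice/geocodeservice.svc?wsdl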

    Read the article

  • AllowPartiallyTrustedCallersAttribute exception - unit testing with moq

    - by vdh_ant
    Hi guys, I am receiving the following exception when trying to run my unit tests using .NET 4.0 under VS2010 with Moq 3.1. Attempt by security transparent method 'SPPD.Backend.DataAccess.Test.Specs_for_Core.When_using_base.Can_create_mapper()' to access security critical method 'Microsoft.VisualStudio.TestTools.UnitTesting.Assert.IsNotNull(System.Object)' failed. Assembly 'SPPD.Backend.DataAccess.Test, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' is marked with the AllowPartiallyTrustedCallersAttribute, and uses the level 2 security transparency model. Level 2 transparency causes all methods in AllowPartiallyTrustedCallers assemblies to become security transparent by default, which may be the cause of this exception. The test I am running is really straightforward and looks something like the following: [TestMethod] public void Can_create_mapper() { this.SetupTest(); var mockMapper = new Moq.Mock<IMapper>().Object; this._Resolver.Setup(x => x.Resolve<IMapper>()).Returns(mockMapper).Verifiable(); var testBaseDa = new TestBaseDa(); var result = testBaseDa.TestCreateMapper<IMapper>(); Assert.IsNotNull(result); //<<< THROWS EXCEPTION HERE Assert.AreSame(mockMapper, result); this._Resolver.Verify(); } I have no idea what this means, and I have been looking around and have found very little on the topic. The closest reference I have found is this: http://dotnetzip.codeplex.com/Thread/View.aspx?ThreadId=80274 but it's not very clear on what they did to fix it... Anyone got any ideas?
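    A sketch of the workaround most often suggested for this message (it assumes the test assembly itself carries AllowPartiallyTrustedCallers in its AssemblyInfo.cs, which is worth verifying): either delete that attribute, or keep it and opt back into the old level-1 rules:

        // AssemblyInfo.cs (sketch): under CLR 4, APTCA switches the assembly to
        // level-2 transparency, making every method security-transparent.
        // Option 1: remove the attribute.
        // [assembly: System.Security.AllowPartiallyTrustedCallers]

        // Option 2: keep it, but revert to the .NET 2.0 transparency model.
        [assembly: System.Security.SecurityRules(System.Security.SecurityRuleSet.Level1)]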

    Read the article

  • Using Microsoft.Reporting.WebForms.ReportViewer in a custom SharePoint WebPart

    - by iHeartDucks
    I have a requirement where I have to display some data (from a custom db) and let the user export it to an Excel file. I decided to use the ReportViewer control present in Microsoft.Reporting.WebForms. I added the required assembly to the project references and added the following code to test it out: protected override void CreateChildControls() { base.CreateChildControls(); objRV = new Microsoft.Reporting.WebForms.ReportViewer(); objRV.ID = "objRV"; this.Controls.Add(objRV); } The first error asked me to add a line to the web.config, which I did, and the next error says: The type 'Microsoft.SharePoint.Portal.Analytics.UI.ReportViewerMessages, Microsoft.SharePoint.Portal, Version=12.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c' does not implement IReportViewerMessages. Is it possible to use the ReportViewer in my custom Web Part? I'd rather not bind a repeater and write my own export-to-Excel code; I want to use something which is already built by Microsoft. Any ideas on what I can reuse? Edit: I commented out the following line <add key="ReportViewerMessages"... and now my code looks like this after I added a data source to it: protected override void CreateChildControls() { base.CreateChildControls(); objRV = new Microsoft.Reporting.WebForms.ReportViewer(); objRV.ID = "objRV"; objRV.Visible = true; Microsoft.Reporting.WebForms.ReportDataSource datasource = new Microsoft.Reporting.WebForms.ReportDataSource("test", GroupM.Common.DB.GetAllClientCodes()); objRV.LocalReport.DataSources.Clear(); objRV.LocalReport.DataSources.Add(datasource); objRV.LocalReport.Refresh(); this.Controls.Add(objRV); } but now I do not see any data on the page. I did check my db call and it does return a data table with 15 rows. Any ideas why I don't see anything on the page?
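    One hedged explanation for the blank output: a ReportViewer in local mode renders nothing without a report definition (.rdlc), and the data source name must match the DataSet name inside that definition. A sketch (the path is hypothetical, and "test" must match the DataSet defined in the .rdlc):

        objRV.ProcessingMode = Microsoft.Reporting.WebForms.ProcessingMode.Local;
        // Hypothetical report path; the ReportDataSource name must match the
        // DataSet name declared inside this .rdlc.
        objRV.LocalReport.ReportPath = Server.MapPath("~/Reports/Clients.rdlc");
        objRV.LocalReport.DataSources.Add(
            new Microsoft.Reporting.WebForms.ReportDataSource("test", GroupM.Common.DB.GetAllClientCodes()));
        objRV.LocalReport.Refresh();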

    Read the article

  • Installing into the GAC with WiX 3.0

    - by Jeff Yates
    I have a DLL that I would like to install into the Global Assembly Cache so that it can be referenced from multiple locations. I have a File declaration with the Assembly attribute set to ".net", but when the installation tries to install the DLL into the GAC, I get the following error (I have tidied it up a bit to make it more readable): MSI (s) (58:38) [19:14:31:031]: Product: MyProductName 1.01 -- Error 1935. An error occurred during the installation of assembly 'Compass, version="1.0.0.0", culture="neutral", publicKeyToken="392B26B760D48103", processorArchitecture="MSIL"'. Please refer to Help and Support for more information. HRESULT: 0x80131043. assembly interface: IAssemblyCacheItem, function: Commit, component: {53AEE63B-F356-4D4F-8D61-EB0640A6E160} I have hunted around to find out what this means and the error relates to FUSION_E_UNEXPECTED_MODULE_FOUND. This link also includes this information: /// When installing multi-file assemblies into the GAC, the hash of each module is checked against the hash of that file stored in the manifest. If the hash of one of the files in the multi-file assembly does not match what is recorded in the manifest, FUSION_E_UNEXPECTED_MODULE_FOUND will be returned. /// The name of the error, and the text description of it, are somewhat confusing. The reason this error code is described this way is that, internally, Fusion/CLR implements installation of assemblies in the GAC by installing multiple "streams" that are individually committed. Each stream has its hash computed, and all the hashes found are compared against the hashes in the manifest at the end of the installation. Hence, a file hash mismatch appears as if an "unexpected" module was found. Unfortunately, this doesn't make much sense to me and I don't see how it relates to my assembly, which isn't fancy or complex from my perspective (it's just a regular .NET 3.5 class library, and the current installation test is occurring on my development machine, which is a valid target environment for my project - 32-bit Windows XP SP3). Can anyone shed some light on why I might be getting this error and how I might hope to fix it?
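    For reference, a typical WiX 3 GAC-install component looks like the sketch below. Given that 0x80131043 is a hash-verification failure, one hedged thing to double-check is that Compass.dll is fully signed rather than delay-signed when it is harvested, since the GAC verifies file hashes on commit; this is a suggestion, not a confirmed diagnosis:

        <!-- sketch: Assembly=".net" plus KeyPath="yes" on the file is the usual GAC pattern;
             $(var.BinDir) is a placeholder preprocessor variable -->
        <Component Id="CompassGac" Guid="PUT-GUID-HERE">
          <File Id="CompassDll" Source="$(var.BinDir)\Compass.dll" KeyPath="yes" Assembly=".net" />
        </Component>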

    Read the article

  • Nhibernate exception - No persister for

    - by Muhammad Akhtar
    I am getting an exception when calling a stored procedure using NHibernate. Here is the exception: No persister for: ReleaseDAL.ProgressBars, ReleaseDAL, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null Here is the class file: public class ProgressBars { public ProgressBars() { } private Int32 _Tot; private Int32 _subtot; public virtual Int32 Tot { get { return _Tot; } set { _Tot = value; } } public virtual Int32 subtot { get { return _subtot; } set { _subtot = value; } } } Here is my mapping file: <hibernate-mapping xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="urn:nhibernate-mapping-2.2" assembly="ReleaseDAL" namespace="ReleaseDAL" > <sql-query name="ps_getProgressBarData1" > <return alias="ProgressBars" class="ProgressBars"> <return-property name="Tot" column="Tot"/> <return-property name="subtot" column="subtot"/> </return> exec ps_getProgressBarData1 </sql-query> </hibernate-mapping> Calling code: public List<ProgressBars> getProgressBarData1(Guid ReleaseId) { ISession session = NHibernateHelper.GetCurrentSession(); List<ProgressBars> progressBar = (List<ProgressBars>)session.GetNamedQuery("ps_getProgressBarData1").List<ProgressBars>(); NHibernateHelper.CloseSession(); return progressBar; } I am in trouble because of this exception; I did a lot of Googling but had no success. Please give me an idea of how to solve this problem. Thanks.
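    "No persister" generally means NHibernate has no class mapping for ProgressBars; a named SQL query's <return class=...> still requires one. A sketch of an alternative that avoids mapping the DTO at all, using NHibernate's scalar projection and AliasToBean transformer (the exec string matches the mapping above):

        using NHibernate;
        using NHibernate.Transform;

        // Project the two scalar columns straight into the unmapped DTO.
        var progressBars = session.CreateSQLQuery("exec ps_getProgressBarData1")
            .AddScalar("Tot", NHibernateUtil.Int32)
            .AddScalar("subtot", NHibernateUtil.Int32)
            .SetResultTransformer(Transformers.AliasToBean<ProgressBars>())
            .List<ProgressBars>();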

    Read the article

  • applicationSettings and Web.config

    - by Eric J.
    I have a DLL that provides logging that I use for WebForms projects and now wish to use in an ASP.NET MVC 2 project. Some aspects of that DLL are configured in app.config: <configuration> <configSections> <sectionGroup name="applicationSettings" type="System.Configuration.ApplicationSettingsGroup, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"> <section name="Tools.Instrumentation.Properties.Settings" type="System.Configuration.ClientSettingsSection, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" requirePermission="false" /> </sectionGroup> </configSections> <applicationSettings> <Tools.Instrumentation.Properties.Settings> <setting name="LogLevel" serializeAs="String"> <value>DEBUG</value> </setting> <setting name="AppName" serializeAs="String"> <value>MyApp</value> </setting> <setting name="Port" serializeAs="String"> <!--value>33333</value--> <value>0</value> </setting> </Tools.Instrumentation.Properties.Settings> </applicationSettings> </configuration> However, when I create a similar entry in Web.config, I get the error: Unrecognized configuration section applicationSettings. My two-part question: How do I make this config entry work in Web.config? Where can I read up on the conceptual differences between WinForms configuration and ASP.Net configuration?
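    For what it's worth, a config section is only recognized when its <configSections> declaration is present in the same config file, so the sectionGroup/section block has to be copied into Web.config as well. A sketch of the top of Web.config:

        <configuration>
          <configSections>
            <!-- must appear before any use of <applicationSettings> below -->
            <sectionGroup name="applicationSettings" type="System.Configuration.ApplicationSettingsGroup, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">
              <section name="Tools.Instrumentation.Properties.Settings" type="System.Configuration.ClientSettingsSection, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" requirePermission="false" />
            </sectionGroup>
          </configSections>
          <!-- ...then the <applicationSettings> block from app.config... -->
        </configuration>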

    Read the article

  • Unknown Build Error using WPF Toolkit

    - by Tom Allen
    I installed the Feb 2010 WPF Toolkit as I'm interested in evaluating the AutoCompleteBox control, and I'm having extremely limited success. I can get the control to work, but as soon as I try to set any of its properties in XAML, I get the following: Unknown build error, 'Cannot resolve dependency to assembly 'WPFToolkit, Version=3.5.40128.1, Culture=neutral, PublicKeyToken=31bf3856ad364e35' because it has not been preloaded. When using the ReflectionOnly APIs, dependent assemblies must be pre-loaded or loaded on demand through the ReflectionOnlyAssemblyResolve event.' I've been testing this on a blank WPF window in a new solution. I'm guessing I'm just missing a reference or something... Here's the XAML (I've added nothing to the .xaml.cs): <Window x:Class="WpfToolkitApplication.Window1" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml" xmlns:toolkit="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Input.Toolkit" Title="Window1" Height="300" Width="300"> <Grid> <toolkit:AutoCompleteBox Height="25"/> </Grid> </Window> The only reference I've added is System.Windows.Controls.Input.Toolkit. Any ideas?
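    The message reads like the build can't resolve WPFToolkit.dll itself, which System.Windows.Controls.Input.Toolkit depends on. A hedged sketch of the fix: reference WPFToolkit alongside the Input.Toolkit assembly, e.g. in the project file:

        <ItemGroup>
          <!-- sketch: Input.Toolkit depends on WPFToolkit, so reference both -->
          <Reference Include="System.Windows.Controls.Input.Toolkit" />
          <Reference Include="WPFToolkit" />
        </ItemGroup>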

    Read the article

  • Pass value of a field to Silverlight ConverterParameter

    - by eidylon
    Hi all, I'm writing my very first Silverlight app. I have a datagrid with a column that has two labels; for the labels, I am using an IValueConverter to conditionally format the data. The labels' "Content" is set as such: Content="{Binding HomeScore, Converter={StaticResource fmtshs}}" and Content="{Binding AwayScore, Converter={StaticResource fmtshs}}" The Convert method of my IValueConverter is as follows: Public Function Convert( ByVal value As Object, ByVal targetType As System.Type, ByVal parameter As Object, ByVal culture As System.Globalization.CultureInfo) As Object Implements System.Windows.Data.IValueConverter.Convert Dim score As Long = value, other As Long = parameter Return If(score < 0, "", If(score - other > 5, (other + 5).ToString, score.ToString)) End Function So what I want to do is: in the converter for HomeScore, I want to pass AwayScore to the ConverterParameter, and for AwayScore I want to pass HomeScore to the converter. In the converter for either score I need to be able to know the value of the other score for formatting purposes. But I cannot figure out the syntax for binding the ConverterParameter to another field. I've tried the following: Content="{Binding HomeScore, Converter={StaticResource fmtshs}, ConverterParameter=AwayScore}" Content="{Binding HomeScore, Converter={StaticResource fmtshs}, ConverterParameter={AwayScore}}" Content="{Binding HomeScore, Converter={StaticResource fmtshs}, ConverterParameter={Binding AwayScore}}" But none of those seems to work. How do I pass a field value to the ConverterParameter?
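    ConverterParameter is not a dependency property in Silverlight, so it cannot take a {Binding}. One common workaround is to bind Content to the whole row, Content="{Binding Converter={StaticResource fmtshs}}", and let the converter read both scores itself. A sketch in C# (a Game type with HomeScore/AwayScore properties is assumed from the post):

        using System;
        using System.Globalization;
        using System.Windows.Data;

        public class HomeScoreConverter : IValueConverter
        {
            public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
            {
                var game = (Game)value;                 // the entire data item, not one field
                long score = game.HomeScore, other = game.AwayScore;
                if (score < 0) return "";
                return score - other > 5 ? (other + 5).ToString() : score.ToString();
            }

            public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
            {
                throw new NotSupportedException();
            }
        }

    An AwayScoreConverter would mirror this with the two properties swapped.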

    Read the article

  • MSBuild: building website using AspNetCompiler - adding references?

    - by Tom Morgan
    Hi, I'm attempting to build an ASP.NET website using MSBuild - specifically the AspNetCompiler task. I know that, for my project, I need to add some references. Within Visual Studio I have several references; one is a project reference and the others are some DLLs (AjaxControlToolkit etc). I'm happy not referencing the project and referencing the DLL instead - however, I just can't work out how to add a reference. I've looked up and down and this is what I've found so far: <Target Name="PrecompileWeb"> <AspNetCompiler VirtualPath="DeployTemp" PhysicalPath="D:\AutoBuild\CruiseControl\Projects\Websites\MyCompany\2.0.0\WorkingDirectory\VSS" TargetPath="D:\AutoBuild\CruiseControl\Projects\Websites\MyCompany\2.0.0\PreCompiled" Force="true" Debug="true" Updateable="true"/> </Target> Also, I've picked up this bit of code from around the web somewhere, which I thought might help: <ItemGroup> <Reference Include="My.Web.DataEngine, Culture=neutral, processorArchitecture=MSIL"> <SpecificVersion>False</SpecificVersion> <HintPath>D:\AutoBuild\CruiseControl\Projects\Components\My.Web.DataEngine\bin\Debug\My.Web.DataEngine.dll</HintPath> </Reference> </ItemGroup> What I want to do is add an attribute to the AspNetCompiler tag, something like References="@(Reference)", but MSBuild isn't very happy about this. I've been a bit stuck in not being able to find decent references on doing this anywhere, so I'd really appreciate some pointers or reference material etc. (or just the answer!) Thanks for your help. -tom
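    The AspNetCompiler task has no References parameter, which is why MSBuild rejects the attribute; for a website, the compiler picks dependencies up from the site's Bin folder. A hedged sketch of the usual workaround ($(WebRoot) and $(PrecompiledDir) are placeholder properties standing in for the paths above):

        <Target Name="PrecompileWeb">
          <!-- copy the @(Reference) items' DLLs into Bin before precompiling -->
          <Copy SourceFiles="@(Reference->'%(HintPath)')"
                DestinationFolder="$(WebRoot)\Bin" />
          <AspNetCompiler VirtualPath="DeployTemp"
                          PhysicalPath="$(WebRoot)"
                          TargetPath="$(PrecompiledDir)"
                          Force="true"
                          Debug="true"
                          Updateable="true" />
        </Target>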

    Read the article

  • Gacutil.exe successfully adds assembly, but assembly not viewable in explorer. Why?

    - by Ben McCormack
    I'm running GacUtil.exe from within Visual Studio Command Prompt 2010 to register a dll (CatalogPromotion.dll) to the GAC. After running the utility, it says "Assembly successfully added to the cache", and running gacutil /l CatalogPromotionDll shows that the GAC contains the assembly, but I can't see the assembly when I navigate to C:\WINDOWS\assembly from Windows Explorer. Why can't I see the assembly in WINDOWS\assembly from Windows Explorer but I can see it using gacutil.exe? Background: Here's what I typed into the command prompt for VS Tools: C:\_Dev Projects\VS Projects\bmccormack\CatalogPromotion\CatalogPromotionDll\bin\Debug>gacutil /i CatalogPromotionDll.dll Microsoft (R) .NET Global Assembly Cache Utility. Version 4.0.30319.1 Copyright (c) Microsoft Corporation. All rights reserved. Assembly successfully added to the cache C:\_Dev Projects\VS Projects\bmccormack\CatalogPromotion\CatalogPromotionDll\bin\Debug>gacutil /l CatalogPromotionDll Microsoft (R) .NET Global Assembly Cache Utility. Version 4.0.30319.1 Copyright (c) Microsoft Corporation. All rights reserved. The Global Assembly Cache contains the following assemblies: CatalogPromotionDll, Version=1.0.0.0, Culture=neutral, PublicKeyToken=9188a175f199de4a, processorArchitecture=MSIL Number of items = 1 However, the assembly doesn't show up in C:\WINDOWS\assembly.
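    A hedged explanation: the gacutil here is from the .NET 4 SDK (note the 4.0.30319 banner), and it installs into the CLR 4 GAC under C:\Windows\Microsoft.NET\assembly, while the Explorer view at C:\WINDOWS\assembly is the shfusion shell extension that only displays the older CLR 2 GAC. A quick check from a command prompt:

        rem sketch: the CLR 4 GAC lives here, outside the old Explorer view
        dir C:\Windows\Microsoft.NET\assembly\GAC_MSIL\CatalogPromotionDll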

    Read the article
