Search Results

Search found 4914 results on 197 pages for 'iis 7 5'.


  • Visual Studio crashes consistently on web-related projects

    - by Traveling Tech Guy
    Hi, I have a brand new VS2010 installation on a Win2008R2 machine. I started getting this error when debugging a WCF service project: "Attempted to read or write protected memory. This is often an indication that other memory is corrupt." When I started developing a web site a week later, this became consistent - I can't debug it. The stack dump reads:

        at Microsoft.VisualStudio.WebHost.Host.ProcessRequest(Connection conn)
        at Microsoft.VisualStudio.WebHost.Server.OnSocketAccept(Object acceptedSocket)
        at System.Threading.QueueUserWorkItemCallback.WaitCallback_Context(Object state)
        at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean ignoreSyncCtx)
        at System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
        at System.Threading.ThreadPoolWorkQueue.Dispatch()
        at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback()

    I tried searching online, and some recommend turning off "Suppress JIT Optimizations" in the Debugging options - this does not seem to make a difference. Clearly the problem is with the built-in web server. But am I doing something wrong? Is there something I can do? Or is this a known bug? Thanks for your time, Guy

    Update 12/31: Today I tried using CassiniDev as a replacement for the original VS2010 web server - exact same result. My suspicion is that there's some internal conflict between VS2010, Windows Server 2008 R2, and maybe the fact that it's a 64-bit OS. I switched to using IIS as my debug server - and that seems to work, with some annoying side effects. My conclusion: do not use a 64-bit server system as your dev machine. Develop on 32-bit - deploy to 64-bit. Side conclusion: there are some scenarios Microsoft's QA doesn't test.

    Read the article

  • Why isn't the Cache invalidated after table update using the SqlCacheDependency?

    - by Jason
    I have been trying to get SqlCacheDependency working. I think I have everything set up correctly, but when I update the table, the item in the Cache isn't invalidated. Can you look at my code and see if I am missing anything? I enabled the Service Broker for the Sandbox database. I have placed the following code in the Global.asax file. I also restart IIS to make sure it is called.

        void Application_Start(object sender, EventArgs e)
        {
            SqlDependency.Start(ConfigurationManager.ConnectionStrings["SandboxConnectionString"].ConnectionString);
        }

    I have placed this entry in the web.config file:

        <system.web>
          <caching>
            <sqlCacheDependency enabled="true" pollTime="10000">
              <databases>
                <add name="Sandbox" connectionStringName="SandboxConnectionString"/>
              </databases>
            </sqlCacheDependency>
          </caching>
        </system.web>

    I call this code to put the item into the cache:

        protected void CacheDataSetButton_Click(object sender, EventArgs e)
        {
            using (SqlConnection sqlConnection = new SqlConnection(ConfigurationManager.ConnectionStrings["SandboxConnectionString"].ConnectionString))
            using (SqlCommand sqlCommand = new SqlCommand("SELECT PetID, Name, Breed, Age, Sex, Fixed, Microchipped FROM dbo.Pets", sqlConnection))
            using (SqlDataAdapter sqlDataAdapter = new SqlDataAdapter(sqlCommand))
            {
                DataSet petsDataSet = new DataSet();
                sqlDataAdapter.Fill(petsDataSet, "Pets");
                SqlCacheDependency petsSqlCacheDependency = new SqlCacheDependency(sqlCommand);
                Cache.Insert("Pets", petsDataSet, petsSqlCacheDependency, DateTime.Now.AddSeconds(10), Cache.NoSlidingExpiration);
            }
        }

    Then I bind the GridView with this code:

        protected void BindGridViewButton_Click(object sender, EventArgs e)
        {
            if (Cache["Pets"] != null)
            {
                GridView1.DataSource = Cache["Pets"] as DataSet;
                GridView1.DataBind();
            }
        }

    Between attempts to DataBind the GridView, I change the table's values, expecting the change to invalidate the Cache["Pets"] item, but it seems to stay in the Cache indefinitely.
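    One ordering detail that can produce exactly this symptom (offered as a likely cause, not something stated in the post): a command-based SqlCacheDependency only subscribes to query notifications if it is constructed before the command executes, because the subscription request travels with the command's execution. A minimal reordering sketch of the cache-population code above:

        // Hypothetical fix: create the dependency *before* Fill runs the command,
        // so the notification subscription is registered with SQL Server.
        SqlCacheDependency petsSqlCacheDependency = new SqlCacheDependency(sqlCommand);
        sqlDataAdapter.Fill(petsDataSet, "Pets");
        // Drop the 10-second absolute expiration while testing, so that any
        // eviction you observe is actually the dependency firing.
        Cache.Insert("Pets", petsDataSet, petsSqlCacheDependency,
            System.Web.Caching.Cache.NoAbsoluteExpiration,
            System.Web.Caching.Cache.NoSlidingExpiration);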

    Read the article

  • RewriteCond and RewriteRule newbie

    - by mybrokengnome
    I'm taking over a website for a client that is running on a custom-built CMS (that I didn't write). I don't usually mess with .htaccess files, because a lot of my hosting is on IIS, or I use WordPress as a CMS and don't have to worry about the .htaccess file. Here are the contents of the file:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ framework.php?%{QUERY_STRING}&resource=$1& [L]

    I get what it's doing (sending all requests through the framework.php file). The client wants a WordPress blog added to their site, and I'm placing it in a /blog/ folder. The problem is that because of the rewrite rules and conditions in the .htaccess file, whenever I try to go to /blog/ the CMS freaks out, because it doesn't like me going there. My question is: how do I write a rule/condition that tells Apache to serve all requests made to the /blog/ folder from the /blog/ folder, but keep all other requests piped through the framework.php file as they are now? Any help is appreciated, thanks!
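    A common pattern for this kind of exclusion (a sketch, not tested against this particular CMS) is a pass-through rule for the blog folder placed before the catch-all:

        RewriteEngine on
        # Let anything under /blog/ be served as-is (WordPress ships its own .htaccess)
        RewriteRule ^blog($|/) - [L]
        # Everything else still goes through the CMS front controller
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ framework.php?%{QUERY_STRING}&resource=$1& [L]

    The `- [L]` target means "rewrite to nothing and stop", so requests starting with /blog never reach the framework.php rule.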

    Read the article

  • (500) Internal Server Error with C# and Web Dev 2008 Express

    - by user32848
    The code below is generic - found in a variety of places, including a book I have. I used it as the base for a working program in VS 2005. Now I've resurrected it with my current Visual Web Developer 2008 Express Edition, and I seem to have problems connecting to my default development server (I don't have IIS on my XP machine). The error is: (500) Internal Server Error. Is this saying what I thought it did (above), or something else, and how do I solve this problem?

        using System;
        using System.Collections;
        using System.Collections.Generic;
        using System.Configuration;
        using System.IO;
        using System.Net;
        using System.Linq;
        using System.Web;
        using System.Web.Security;
        using System.Web.UI;
        using System.Web.UI.WebControls;
        using System.Web.UI.WebControls.WebParts;
        using System.Web.UI.HtmlControls;
        using System.Text.RegularExpressions;

        public partial class _Default : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                string strResult = "";
                string url = "http://weather.unisys.com";
                WebResponse objResponse;
                WebRequest objRequest;
                try
                {
                    objRequest = System.Net.HttpWebRequest.Create(url);
                }
                catch
                {
                    objRequest = System.Net.HttpWebRequest.Create("http://" + url);
                }
                objResponse = objRequest.GetResponse();
                using (StreamReader sr = new StreamReader(objResponse.GetResponseStream()))
                {
                    strResult = sr.ReadToEnd();
                    sr.Close();
                }
            }
        }
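    One way to see what the server is actually complaining about (a debugging sketch, under the assumption that the 500 is thrown by the GetResponse call) is to catch the WebException and read the error body it carries:

        try
        {
            objResponse = objRequest.GetResponse();
        }
        catch (WebException ex)
        {
            // A 500 response still has a body with the real error text.
            // ex.Response can be null for pure network failures, hence the guard.
            if (ex.Response != null)
            {
                using (StreamReader errorReader = new StreamReader(ex.Response.GetResponseStream()))
                {
                    strResult = errorReader.ReadToEnd();
                }
            }
        }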

    Read the article

  • Replacement for Hamachi for SVN access

    - by Piers
    My company has been using Hamachi to access our SVN repository for a number of years. We are a small yet widely distributed development team, with each programmer in a different country working from home. The server is hosted by a non-techie in our central office; Hamachi is useful here since it has a GUI and supports remote management. This system worked well for a while, but recently I moved to a country with poor internet speeds. Hamachi will no longer connect 99% of the time - instead I get a "Probing..." message that never resolves. It's almost certainly a latency issue, as the same laptop connects without problems when I cross the border and use a different ISP with better speeds. So I really need to replace Hamachi with some other VPN/protocol that handles latency better. The techie managing the repository is not comfortable installing and configuring Apache or IIS, so it looks like HTTP is out. I tried to convince my boss to go with a web hosting company, but he doesn't trust a 3rd party with our source. Any other recommended options/experiences out there for accessing our SVN repos that would be as simple as Hamachi to set up, but more tolerant of network latency issues?

    Read the article

  • HTTP Compression problems on IIS7

    - by Jonathan Wood
    I've spent quite a bit of time on this but seem to be getting nowhere. I have a large page that I really want to speed up. The obvious place to start seems to be HTTP compression, but I just can't seem to get it to work. After considerable searching, I've tried several variations of the code below. It kind of works, but after refreshing the browser the results fall apart. They were turning to garbage when the page used caching; if I turn off caching, then the page seems right, but I lose my CSS formatting (stored in a separate file) and get an error that an included JS file contains invalid characters. Most of the resources I've found on the web are either very old or focused on accessing IIS directly. My page is running on a shared hosting account, and I do not have direct access to the IIS7 instance it runs on.

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Implement HTTP compression
            if (Request["HTTP_X_MICROSOFTAJAX"] == null)  // Avoid compressing AJAX calls
            {
                // Retrieve accepted encodings
                string encodings = Request.Headers.Get("Accept-Encoding");
                if (encodings != null)
                {
                    // Verify support for gzip or deflate (gzip takes preference)
                    encodings = encodings.ToLower();
                    if (encodings.Contains("gzip") || encodings == "*")
                    {
                        Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress);
                        Response.AppendHeader("Content-Encoding", "gzip");
                        Response.Cache.VaryByHeaders["Accept-encoding"] = true;
                    }
                    else if (encodings.Contains("deflate"))
                    {
                        Response.Filter = new DeflateStream(Response.Filter, CompressionMode.Compress);
                        Response.AppendHeader("Content-Encoding", "deflate");
                        Response.Cache.VaryByHeaders["Accept-encoding"] = true;
                    }
                }
            }
        }

    Is anyone having better success with this?
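    A possibility worth testing (an assumption about the interaction, not a confirmed diagnosis): when output caching stores the already-compressed body, a later cache hit replays it without re-running the page, and a filter attached in BeginRequest can end up re-compressing or mislabeling those bytes. Hooking the handler stage instead means the filter is only installed on requests that actually execute the page:

        // Hypothetical variant: PreRequestHandlerExecute fires only when a
        // handler will run, and is skipped when the output cache serves the
        // response, so cached bytes are never wrapped a second time.
        protected void Application_PreRequestHandlerExecute(object sender, EventArgs e)
        {
            // ... same compression logic as in Application_BeginRequest above ...
        }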

    Read the article

  • AppFabric caching's local cache isn't working for us... What are we doing wrong?

    - by Olly
    We are using AppFabric as the second-level cache for an NHibernate ASP.NET application comprising a customer-facing website and an admin website. They are both connected to the same cache, so when admin updates something, the customer-facing site is updated. It seems to be working OK - we have a cache cluster on a separate server and all is well - but we want to enable the local cache to get better performance. However, it doesn't seem to be working. We have enabled it like this:

        bool UseLocalCache = true;
        int LocalCacheObjectCount = int.MaxValue;
        TimeSpan LocalCacheDefaultTimeout = TimeSpan.FromMinutes(3);
        DataCacheLocalCacheInvalidationPolicy LocalCacheInvalidationPolicy =
            DataCacheLocalCacheInvalidationPolicy.TimeoutBased;

        if (UseLocalCache)
        {
            configuration.LocalCacheProperties = new DataCacheLocalCacheProperties(
                LocalCacheObjectCount,
                LocalCacheDefaultTimeout,
                LocalCacheInvalidationPolicy
            );
            // configuration.NotificationProperties = new DataCacheNotificationProperties(500, TimeSpan.FromSeconds(300));
        }

    Initially we tried using a timeout invalidation policy (3 minutes) and our app felt like it was running faster. HOWEVER, we noticed that if we changed something in the admin site, it was immediately updated in the live site. As we are using timeouts, not notifications, this demonstrates that the local cache isn't being queried (or is, but always misses). cache.GetType().Name returns "LocalCache" - so the factory has made a local cache. Running "Get-Cache-Statistics MyCache" in PowerShell on my dev environment (ASP.NET app running locally from VS2008, cache cluster running on a separate W2K8 machine) shows a handful of Request Counts. However, on the production environment, the Request Count increases dramatically. We tried following the method here to see the cache client-server traffic: http://blogs.msdn.com/b/appfabriccat/archive/2010/09/20/appfabric-cache-peeking-into-client-amp-server-wcf-communication.aspx - but the log file had nothing but the initial header in it, i.e. no logging either. I can't find anything on SO or Google. Have we done something wrong? Have we got a screwy install of AppFabric? We installed it via Web Platform Installer, I think. (Note: the IIS box running ASP.NET isn't in the cluster - it is just the client.) Any insights gratefully received!
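    One configuration detail that commonly produces exactly this symptom (an assumption, since it isn't verifiable from the post): the local cache lives inside the DataCacheFactory instance, so if a new factory is constructed per request, every request starts with an empty local cache and goes straight to the cluster. A sketch of the usual singleton pattern (names and the "cachehost" endpoint are placeholders):

        // Hypothetical singleton: the local cache belongs to the factory,
        // so the factory (not just the DataCache handle) must be long-lived.
        public static class CacheHolder
        {
            private static readonly DataCacheFactory Factory = CreateFactory();

            public static DataCache GetCache()
            {
                return Factory.GetCache("MyCache");
            }

            private static DataCacheFactory CreateFactory()
            {
                var configuration = new DataCacheFactoryConfiguration();
                // "cachehost" is a placeholder for the cluster host name
                configuration.Servers = new[] { new DataCacheServerEndpoint("cachehost", 22233) };
                configuration.LocalCacheProperties = new DataCacheLocalCacheProperties(
                    int.MaxValue,
                    TimeSpan.FromMinutes(3),
                    DataCacheLocalCacheInvalidationPolicy.TimeoutBased);
                return new DataCacheFactory(configuration);
            }
        }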

    Read the article

  • Identity.Name is disposed in an IIS7 ASP.NET MVC application thread

    - by vIceBerg
    I have made the smallest demo project to illustrate my problem. You can download the sources Here. Visual Studio 2008, .NET 3.5, IIS7, Windows 7 Ultimate 32-bit. The IIS website is configured ONLY for Windows Authentication, in an integrated-pipeline app pool (DefaultAppPool). Here's the problem. I have an ASP.NET MVC 2 application. In an action, I start a thread. The view returns. The thread is doing its job... but it needs to access Thread.CurrentPrincipal.Identity.Name. BANG. The worker process of IIS7 stops. I get a window that says:

        Visual Studio Just-In-Time Debugger
        An unhandled exception ('System.ObjectDisposedException') occurred in w3wp.exe [5524]

    I checked with the debugger: Thread.CurrentPrincipal.Identity is valid, but the Name property is disposed. If I put a long wait in the action before it returns the view, then the thread can do its job and Identity.Name is not disposed. So I think the Name gets disposed when the view is returned. For the sake of the discussion, here's the code that the thread runs (you can also download the demo project; the link is at the top of this post):

        private void Run()
        {
            const int SECTOWAIT = 3;
            // wait SECTOWAIT seconds
            long end = DateTime.Now.Ticks + (TimeSpan.TicksPerSecond * SECTOWAIT);
            while (DateTime.Now.Ticks <= end)
                continue;

            // Check the current principal. BANG!!!!!!!!!!!!!
            var userName = Thread.CurrentPrincipal.Identity.Name;
        }

    Here's the code that starts the thread:

        public void Start()
        {
            Thread thread = new Thread(new ParameterizedThreadStart(ThreadProc));
            thread.SetApartmentState(ApartmentState.MTA);
            thread.Name = "TestThread";
            thread.Start(this);
        }

        static void ThreadProc(object o)
        {
            try
            {
                Builder builder = (Builder)o;
                builder.Run();
            }
            catch (Exception ex)
            {
                throw;
            }
        }

    So... what am I doing wrong? Thanks
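    A workaround often suggested for this class of problem (a sketch, assuming the background thread only needs the name, not the live identity): capture the value on the request thread, before the view is returned, and hand the copy to the worker.

        public void Start()
        {
            // Copy the name while the request's identity is still alive; the
            // string copy survives after ASP.NET disposes the WindowsIdentity.
            string capturedName = Thread.CurrentPrincipal.Identity.Name;

            Thread thread = new Thread(() =>
            {
                // Use the captured copy; never touch Thread.CurrentPrincipal here
                var userName = capturedName;
                // ... long-running work using userName ...
            });
            thread.Name = "TestThread";
            thread.Start();
        }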

    Read the article

  • VS2010 error: Unable to start debugging on the web server

    - by GarDavis
    I get the error message "Unable to start debugging on the web server" in Visual Studio 2010. I clicked the Help button and followed the related suggestions without success. This happens with a newly created local ASP.NET project when modified to use IIS instead of Cassini (which works for debugging). It prompts to set debug="true" in the web.config and then immediately pops up the error. Nothing shows up in the Event Viewer. I am able to attach to w3wp to debug; that works but is not as convenient as F5. I also have a similar problem with VS2008 on the same PC. Debugging used to work for both. I have re-registered .NET Framework 4 (aspnet_regiis -i) and run the VS2010 repair (this is the RTM version). I am running on a Windows Server 2008 R2 x64 box. I do have ReSharper v5 installed. There must be some configuration setting or registry value that survives the repair and causes the problem. I'd appreciate any ideas. Thanks, Gary Davis ([email protected])

    Read the article

  • WCF cross-domain policy security error

    - by George2
    Hello everyone, I am using VSTS 2008 + C# + WCF + .NET 3.5 + Silverlight 3.0. I host a Silverlight control in an HTML page and debug it from VSTS 2008 (press F5, then run in the VSTS 2008 built-in ASP.NET development web server), then call a WCF service hosted on another machine running IIS 7.0 + Vista. The WCF service is very simple; it just returns a constant string to the client. When invoking the WCF service from Silverlight, I get the following error message:

        An error occurred while trying to make a request to URI 'https://LabTest/Test.svc'. This could be due to attempting to access a service in a cross-domain way without a proper cross-domain policy in place, or a policy that is unsuitable for SOAP services. You may need to contact the owner of the service to publish a cross-domain policy file and to ensure it allows SOAP-related HTTP headers to be sent. This error may also be caused by using internal types in the web service proxy without using the InternalsVisibleToAttribute attribute. Please see the inner exception for more details.

    Here is the clientaccesspolicy.xml file - anything wrong?

        <?xml version="1.0" encoding="utf-8" ?>
        <access-policy>
          <cross-domain-access>
            <policy>
              <allow-from http-request-headers="*">
                <domain uri="*"></domain>
              </allow-from>
              <grant-to>
                <resource path="/" include-subpaths="true"></resource>
              </grant-to>
            </policy>
          </cross-domain-access>
        </access-policy>

    Thanks in advance, George
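    One thing worth checking with a policy like the one above (offered as a likely cause, since the service URI is https while the debug page is served over http): in Silverlight, `uri="*"` matches only callers using the service's own scheme, so a cross-scheme caller usually has to be listed explicitly, and the file must be reachable at the root of the service domain (e.g. https://LabTest/clientaccesspolicy.xml). A sketch:

        <?xml version="1.0" encoding="utf-8"?>
        <access-policy>
          <cross-domain-access>
            <policy>
              <allow-from http-request-headers="SOAPAction">
                <!-- "*" only covers the service's own scheme (https here);
                     the http origin of the debug page is granted explicitly -->
                <domain uri="https://*"/>
                <domain uri="http://*"/>
              </allow-from>
              <grant-to>
                <resource path="/" include-subpaths="true"/>
              </grant-to>
            </policy>
          </cross-domain-access>
        </access-policy>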

    Read the article

  • ASP.NET MVC 2 user input to SQL 2008 Database problems

    - by Rob
    After publishing from VS2010, the entire website loads and pulls data from the database perfectly. I can even create new users through the site with the correct key code, given out to those who need access. I have two connection strings in my web.config file. The first:

        <add xdt:Transform="SetAttributes" xdt:Locator="Match(name)" name="EveModelContainer"
             connectionString="metadata=res://*/Models.EdmModel.EveModel.csdl|res://*/Models.EdmModel.EveModel.ssdl|res://*/Models.EdmModel.EveModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=localhost;Initial Catalog=fleet;Persist Security Info=True;User ID=fleet;Password=****&quot;"
             providerName="System.Data.EntityClient" />

    The second:

        <add xdt:Transform="SetAttributes" xdt:Locator="Match(name)" name="ApplicationServices"
             connectionString="Data Source=localhost;Initial Catalog=fleet;Persist Security Info=True;User ID=fleet;Password=****;MultipleActiveResultSets=True" />

    The first one is the one that is needed to post data with the main application, EveModelContainer. Everything else is pulled using the standard ApplicationServices connection. Do you see anything wrong with my connection string? I'm at a complete loss here. The site works perfectly on my friend's server and not on mine... Could it be a provider issue? If I go to the IIS 7 manager console and click .NET Users, I get a pop-up message saying the custom provider isn't a trusted provider and asking whether I want to allow it to run at a higher trust level. I'm at the point where I think it's either my string or this trusted provider error... but I have no clue how to add to the trusted provider list. Thank you in advance!!!

    Read the article

  • SIMPLE PHP MVC Framework [closed]

    - by Allen
    I need a simple and basic MVC example to get me started. I don't want to use any of the available packaged frameworks. I need a simple example of a basic PHP MVC framework that would allow, at most, the creation of a simple multi-page site. I am asking for a simple example because I learn best from simple real-world examples. Big popular frameworks (such as CodeIgniter) are too much for me to even try to understand, and the other "simple" examples I have found are not well explained or seem a little sketchy in general. I should add that most examples of simple MVC frameworks I see use mod_rewrite (for URL routing) or some other Apache-only method; I run PHP on IIS. I need to understand a basic MVC framework so that I could develop my own, one that would let me easily extend functionality with classes. I am at the point where I understand basic design patterns and MVC pretty well. I understand them in theory, but when it comes down to actually building a real-world, simple, well-designed MVC framework in PHP, I'm stuck. Edit: I just want to note that I am looking for a simple example that an experienced programmer could whip up in under an hour. I mean simple as in bare-bones simple. I don't want to use any huge frameworks; I am trying to roll my own. I need a decent SIMPLE example to get me going.
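    Since the question explicitly asks for a bare-bones example, here is one possible shape of a front controller that routes on a query parameter, so it needs no mod_rewrite and works as-is on IIS (all names are illustrative, not taken from any framework):

        <?php
        // index.php - front controller; URLs look like index.php?route=page/about

        $route = isset($_GET['route']) ? $_GET['route'] : 'page/index';
        list($controllerName, $action) = array_pad(explode('/', $route), 2, 'index');

        // Model: fetches data (hard-coded here; swap in a database later)
        class PageModel
        {
            public function find($slug)
            {
                $pages = array(
                    'index' => 'Welcome to the home page.',
                    'about' => 'About us.',
                );
                return isset($pages[$slug]) ? $pages[$slug] : null;
            }
        }

        // Controller: coordinates the model and the view
        class PageController
        {
            public function index() { $this->render('index'); }
            public function about() { $this->render('about'); }

            private function render($slug)
            {
                $model = new PageModel();
                $content = $model->find($slug);
                // View: a plain PHP template that echoes $content
                include 'view.php';
            }
        }

        $class = ucfirst($controllerName) . 'Controller';
        if (class_exists($class) && method_exists($class, $action)) {
            $controller = new $class();
            $controller->$action();
        } else {
            header('HTTP/1.1 404 Not Found');
            echo 'Page not found';
        }

    Here view.php is just a template, e.g. `<html><body><?php echo htmlspecialchars($content); ?></body></html>`. Extending it means adding a new XxxController class; the routing line at the bottom never changes.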

    Read the article

  • What could cause a PHP error on an include statement?

    - by J Jones
    I've got a bug in my PHP code that's been terrorizing me for several days now. I'm trying to graft a new module onto an existing Magento (v1.4) site, though I'm very new to the Magento framework. I think I am pretty close to getting to "Hello, World" on a block that I want displayed in the backend, but I'm getting a 500 error when the menu item is selected. I managed to track it down (using echo statements) to a line in the Layout.php file (app\code\core\Mage\Core\Model\Layout.php, around line 472):

        if (class_exists($block, false) || mageFindClassFile($block)) {
            $temp = $block;
            echo "<p>before constructor: $temp</p>";
            $block = new $block($attributes);
            echo "<p>after constructor: $temp</p>";
        }

    For my block, this yields only "before constructor...", so I know this is what is failing. A little more debugging reveals that the class in $block (the new block I am trying to show) does not exist. I would have expected the __autoload function to take care of this, but none of my echoes in __autoload are displaying. As a last-ditch effort, I tried an include statement in Mage.php with the absolute location of the block class, but similar before-and-after echoes reveal that the include statement becomes the breaking line. I'm tempted to start thinking "permissions", but I'm not well versed in the server management side of all this, and I have limited access to the test server (the test server belongs to the client). To anticipate the question: there are no errors reported in the PHP log file. I am actually not convinced that this site is reporting errors to the log file at all (I haven't seen anything from this site), though the client is certain that everything is turned on. IIS 7, Integrated mode, I'm pretty sure. Anyone know what could be causing this?
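    When logging can't be trusted, a quick way to surface the real error (a debugging sketch; the path is a placeholder) is to force error display at the top of the failing script and use require, which dies with the file and line, instead of include, which only warns and continues:

        <?php
        // Temporarily show the real error instead of a bare 500
        // (remove this again before the site goes anywhere near production)
        ini_set('display_errors', '1');
        error_reporting(E_ALL);

        // require halts with a fatal error naming the file and line,
        // which is easier to spot than include's warning-and-continue
        require '/absolute/path/to/YourBlock.php';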

    Read the article

  • error while reading Excel sheet

    - by Lalit
    Hi, I have this C# code to read an Excel sheet:

        DataTable dtChildrenData = new DataTable();
        OdbcConnection oConn = null;
        try
        {
            if (File.Exists(strSheetPath))
            {
                oConn = new OdbcConnection();
                oConn.ConnectionString = @"DSN=Excel Files;DBQ=" + strSheetPath +
                    @";DriverId=1046;FIL=excel 12.0;MaxBufferSize=2048;PageTimeout=5;";
                OdbcCommand oComm = new OdbcCommand();
                oComm.Connection = oConn;
                oComm.CommandText = "Select * From [Sheet1$]";
                DataSet ds = new DataSet();
                OdbcDataAdapter oAdapter = new OdbcDataAdapter(oComm);
                oConn.Open();
                oAdapter.Fill(ds);
                dtChildrenData = ds.Tables[0];
            }
        }
        finally
        {
            if (oConn != null)   // guard: oConn is null if the file doesn't exist
                oConn.Close();
        }
        return dtChildrenData;

    It runs fine locally, but I get this error when I deploy the web application to IIS:

        ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified

    How do I solve this? Please let me know if any information is required to answer this question (about configuration).
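    The usual suspect for "Data source name not found" on deployment (an assumption, but it matches the code running fine locally) is that the "Excel Files" DSN simply doesn't exist on the server, or exists only in the 32-bit ODBC registry while the app pool runs 64-bit (the 32-bit ODBC administrator lives at %windir%\SysWOW64\odbcad32.exe). A DSN-less connection string removes the first dependency:

        // Hypothetical DSN-less variant using the Excel ODBC driver directly,
        // so no per-machine DSN needs to be configured on the server
        oConn.ConnectionString =
            @"Driver={Microsoft Excel Driver (*.xls)};DriverId=790;Dbq=" + strSheetPath + ";";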

    Read the article

  • Stopping cookies being set from a domain (aka "cookieless domain") to increase site performance

    - by Django Reinhardt
    I was reading Google's documentation about improving site speed. One of their recommendations is serving static content (images, CSS, JS, etc.) from a "cookieless domain":

        Static content, such as images, JS and CSS files, don't need to be accompanied by cookies, as there is no user interaction with these resources. You can decrease request latency by serving static resources from a domain that doesn't serve cookies.

    Google then says that the best way to do this is to buy a new domain and point it at your current one:

        To reserve a cookieless domain for serving static content, register a new domain name and configure your DNS database with a CNAME record that points the new domain to your existing domain A record. Configure your web server to serve static resources from the new domain, and do not allow any cookies to be set anywhere on this domain. In your web pages, reference the domain name in the URLs for the static resources.

    This is pretty straightforward stuff, except for the bit where it says to "configure your web server to serve static resources from the new domain, and do not allow any cookies to be set anywhere on this domain". From what I've read, there's no setting in IIS that lets you say "serve static resources", so how do I prevent ASP.NET from setting cookies on this new domain? At present, even if I'm just requesting a .jpg from the new domain, it sets a cookie on my browser, even though our application's cookies are set on our old domain. For example, ASP.NET sets an ".ASPXANONYMOUS" cookie that (as far as I'm aware) we're not telling it to set. Apologies if this is a real newb question, I'm new at this! Thanks.
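    An approach often suggested for this setup (a sketch, assuming the static domain runs as its own IIS7 site in integrated mode): first disable the features that emit cookies in that site's web.config - the .ASPXANONYMOUS cookie in particular comes from `<anonymousIdentification enabled="false"/>`, and `<sessionState mode="Off"/>` and `<authentication mode="None"/>` cover the other common sources - and then, as a belt-and-braces measure, strip any remaining Set-Cookie header on the way out:

        // Global.asax of the static-content site - a sketch; assumes the IIS7
        // integrated pipeline (Response.Headers is not writable in classic mode)
        protected void Application_PreSendRequestHeaders(object sender, EventArgs e)
        {
            HttpContext.Current.Response.Headers.Remove("Set-Cookie");
        }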

    Read the article

  • How can I share an entity framework model across website users

    - by richardmoss
    Hello, Currently my website is based on MVC and the Entity Framework, running against a SQL Server 2005 database. So far it has all been running very smoothly, and I really enjoy MVC and its slimmer, more concise code (and no huge viewstates or soul-destroying postbacks ;)). Recently I was working on upgrading the site to use a simple forum system, and this is where I started running into problems. When I was testing the site using two different browsers, if I created or replied to a post in one browser, the other browser couldn't see the post. At the moment each visitor to the site gets their own copy of the entity model, which I store in their session data. Obviously this is the problem, as updates to one model aren't being carried to the other. As a test, I tried storing a single copy of the model, which all visitors would access, by assigning the model to a static variable. This worked, and both browsers could see each other's modifications. However, it had its side effects. For example, if I fired up both browsers at the same time while the model was being initialized, one browser would crash and the other would work fine, despite me using a locking object, so in theory one of them should have been delayed until the model was ready (of course, I could have implemented this wrong ;)). Also, this site originally did use one model for all visitors, and when it was live it frequently shut down, killing the IIS application pool when it did. Now I'm not sure if this was related, but I don't really want to reintroduce whatever bug caused that shutdown. So, my question is a simple one really: what is the best way of either using the same model for all website users so they all see updates, or, if they do have separate copies (which I imagine will have a performance impact in time), how can the models detect changes in the database and update themselves accordingly? Thanks in advance for any advice! Regards, Richard Moss
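    The usual guidance for this situation (stated as general Entity Framework practice, not something from the post): an ObjectContext is meant to be a short-lived unit of work, so rather than one per session or one shared static instance, create one per HTTP request. A sketch, where "SiteEntities" is a placeholder for the generated context type:

        // One context per request, created lazily and disposed at end of request
        public static class ContextProvider
        {
            private const string Key = "__requestObjectContext";

            public static SiteEntities Current
            {
                get
                {
                    var items = System.Web.HttpContext.Current.Items;
                    if (items[Key] == null)
                        items[Key] = new SiteEntities();
                    return (SiteEntities)items[Key];
                }
            }

            // Call this from Application_EndRequest in Global.asax
            public static void DisposeCurrent()
            {
                var context = System.Web.HttpContext.Current.Items[Key] as SiteEntities;
                if (context != null)
                    context.Dispose();
            }
        }

    Because each request reads fresh data, an update made through the admin site is visible to the customer site on its next request, with no cross-user sharing and no long-lived state to corrupt.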

    Read the article

  • What database strategy to choose for a large web application

    - by Snoopy
    I have to rewrite a large database application, running on 32 servers. The hardware is up to date; each machine has two quad-core Xeons and 32 GByte RAM. The database is multi-tenant; each customer has his own file, around 5 to 10 GByte each. I run around 50 databases on this hardware. The app is open to the web, so I have no control over the load. There are no really complex queries, so SQL is not required if there is a better solution. The databases get updated via FTP every day at midnight; the database is read-only. C# is my favourite language and I want to use ASP.NET MVC. I thought about the following options:

    1. Use two big SQL servers running SQL Server 2012 to serve the 32 servers with data, with the 32 servers running IIS providing REST services.
    2. Denormalize the database and use Redis on each web server, with Booksleeve as the Redis client.
    3. Use a combination of SQL Server and Redis.
    4. Use SQL Server 2012 together with Hadoop.
    5. Use Hadoop without SQL Server.

    What is the best way for a read-only database to get the best performance without losing maintainability? Does MapReduce make sense at all in such a scenario? The reason for the rewrite is that the old app, written in C++ with ISAM technology, is too slow, and the interfaces are old-fashioned and not nice to use from a website, especially with AJAX. The app uses a relational data model with many tables, but it is possible to write one accelerator table against which all queries can be performed, with all other information from the other tables reachable by a simple key lookup.

    Read the article

  • DotNetOpenId - Using OpenIdButton for Google Apps

    - by JediPotPie
    I am an ASP.NET newbie and I am trying to design an OpenID/SSO system for an internal web application. The web application is pretty simple, and authentication is currently managed by a database of usernames and passwords. I want to replace the existing database-backed accounts with Google Apps accounts. I have downloaded the latest DotNetOpenAuth-3.4.3.10103 package and got the OpenIdRelyingPartyWebForms sample up and running on IIS. I have built my own login page using just an OpenIdButton object that points to a development Google domain. The button seems to work fine in Firefox - at least it forwards me to the Google Apps login - but nothing happens when I load the same page in IE. When I click the Google button, nothing happens, zip. The same is true for the Yahoo button in the login.aspx page included in the sample. Here is the .aspx code I am using:

        <rp:OpenIdButton runat="server"
            ImageUrl="http://www.google.com/accounts/google_transparent.gif"
            Text="Login with Google!"
            ID="googleLoginButton"
            Identifier="https://www.google.com/accounts/o8/site-xrds?hd=dev.connexcloud.com" />

    Read the article

  • Error accessing uncompiled pages using ISA Server 2006 SP1

    - by Bravax
    We are in the process of configuring a portal to use ISA Server as our front-end security provider, so we are using ISA Server 2006 SP1. Unfortunately, when we access .NET applications through ISA Server for the first time - i.e., before they have been compiled - the following error appears:

        Error Code: 500 Internal Server Error. The parameter is incorrect. (87)

    In the ISA monitoring logs, this shows:

        Failed Connection Attempt
        Log type: Web Proxy (Reverse)
        Status: 87 The parameter is incorrect.

    Once the application is compiled, the error never appears. Does anyone know how to resolve this, so the site works correctly the first time? Some additional information: the websites accessed are running on Windows Server 2008 64-bit Standard Edition, and the error occurs for SharePoint as well as standard .NET websites. ISA Server is running on Windows Server 2003 R2 SP2 Standard Edition. The firewall on the Windows Server 2008 box allows all access (to rule this out). Nothing odd appears in the IIS logs or firewall logs.

    Read the article

  • ASP.NET Web Application: use 1 or multiple virtual directories

    - by tster
    I am working on a (largish) internal web application which has multiple modules (security, execution, features, reports, etc.). All the pages in the app share navigation, CSS, JS, controls, etc. I want to make a single "Web Application" project that includes all the pages for the app and references various projects holding the database and business logic. However, some of the people on the project want separate projects for the pages of each module. To make this clearer, these are the projects I'm advocating:

        /WebInterface*
        /SecurityLib
        /ExecutionLib
        etc...

    And here is what they are advocating:

        /SecurityInterface*
        /SecurityLib
        /ExecutionInterface*
        /ExecutionLib
        etc...

    *project will be published to a virtual directory of IIS

    Basically what I'm looking for is the advantages of each approach. Here is what I can think of so far.

    Single virtual directory pros:
    - Modules can share a single MasterPage
    - Modules can share UserControls (this will be common)
    - Links to other modules stay within the same virtual directory, and thus don't need to be fully qualified
    - Less chance of having incompatible module versions deployed together

    Multiple virtual directories pros:
    - Can publish a new version of a single module without disrupting other modules
    - Each module is more compartmentalized; less likely that changes will break other modules

    I don't buy those arguments, though. First, using load-balanced servers (which we will have), we should be able to publish new versions of the project with zero downtime, assuming there are no breaking database changes. Second, if something "breaks" another module, then there is either an improper dependency, or the break will show up in the other module eventually anyway, when the developers copy over the latest version of the UserControl, MasterPage, or DLL. As a point of reference, there are about 10 developers on the project at about 50% of their time, and the initial development will take about 9 months.

    Read the article

  • Have you had DLLs fail after upgrading to a 64-bit server?

    - by quakkels
    Hey all, I'm wondering if anyone else has experienced failing DLLs after upgrading their servers. My company is in the process of upgrading our code and servers after ten years of using classic ASP. We've set up our new server running Windows 2008 and IIS 7. Our classic ASP code and our new ASP.NET MVC code work pretty well. Our problems started when we began moving our old websites to the new server. When trying to load a page in a browser on the server machine itself, we initially got a 500 error. If we refreshed the page, then some of the page would load but display this error:

        Server object error 'ASP 0177 : 800401f3'
        Server.CreateObject Failed
        /folder/scriptname.asp, line 24
        800401f3

    (By the way, on remote machines we would just get 500 errors.) Line 24 is the first executable code in the script:

        '23 lines of comments
        set A0SQL_DATA = server.createobject("olddllname.Data")
        'the rest of the script

    That specific line is trying to use a ten-year-old DLL to create a server object. I don't think the server configuration is the problem, because I'm able to create "adodb.recordset" server objects without any problems. Is there a known issue when running correctly registered old DLLs on 64-bit systems? Is there a way to get old DLLs working on a 64-bit system?
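    If the old DLLs are 32-bit COM components (an assumption suggested by the Server.CreateObject failure), a commonly cited fix is letting the application pool run in 32-bit mode, since a 64-bit worker process cannot load 32-bit in-process COM servers:

        rem Hypothetical fix, run on the server (adjust the pool name):
        rem the pool's worker process then runs as 32-bit and can load the old DLLs
        %windir%\system32\inetsrv\appcmd set apppool "DefaultAppPool" /enable32BitAppOnWin64:true

    The same setting is available in IIS Manager under the application pool's Advanced Settings as "Enable 32-Bit Applications".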

    Read the article

  • ASP.NET - Accessing copied content

    - by James Kolpack
    I have a class library project which contains some content files configured with the "Copy if newer" copy build action. This results in the files being copied to a folder under ...\bin\ for every project in the solution. In this same solution, I've got an ASP.NET web project (which is MVC, by the way). In the library I have a static constructor load the files into data structures accessible by the web project. Previously I've been including the content as an embedded resource; I now need to be able to replace the files without recompiling. I want to access the data in three different contexts:

    1. Unit testing the library assembly
    2. Debugging the web application
    3. Hosting the site in IIS

    For unit testing, Environment.CurrentDirectory points to a path containing the copied content. When debugging, however, it points to C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE. I've also looked at Assembly.GetExecutingAssembly().Location, which points to C:\Windows\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\root\c44f9da4\9238ccc\assembly\dl3\eb4c23b4\9bd39460_f7d4ca01\. What I need is the physical location of the webroot \bin folder, but since I'm in a static constructor in the library project, I don't have access to a Request.PhysicalApplicationPath. Is there some other environment variable or structure where I can always find my "Copy if newer" files?
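    For what it's worth, one pattern that covers all three contexts from inside a static constructor, with no reference to HttpContext (a sketch; verify RelativeSearchPath on your hosting setup):

        // Under ASP.NET, RelativeSearchPath resolves to the web app's physical
        // \bin folder; under a unit-test runner it is null, and BaseDirectory is
        // the output folder where "Copy if newer" content lands.
        string contentRoot = AppDomain.CurrentDomain.RelativeSearchPath
                             ?? AppDomain.CurrentDomain.BaseDirectory;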

    Read the article

  • How do I nicely manage many localhost web site URIs with IIS7

    - by Simeon
    I'm having trouble setting up a clean development environment with all the web sites I'm working on. I'm working on up to 40 different web sites, and at least 5 of them simultaneously. I need them all to be in a site root for URL management to work with all CMSes. My first attempt was to use increasing port numbers for them, beginning with localhost:1000 and working upwards. Unfortunately, it took a great deal of looking up which port belonged to which web site, and it was very irritating. My second try was mapping the irritating ports to real words using the hosts file, so I ended up with localhost.tele2, localhost.ikea, localhost.volvo, etc. Unfortunately, this takes a long time to set up (cleaning and adding to the hosts file, finding the next free port for the new site in IIS, etc.), and I regularly have to flush the DNS cache in order to get sites working that I've added to or removed from the hosts file. So how do I organize a lot of web sites in IIS7 nicely? Perhaps I've missed a very clever method that you're using.
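    One way to take the manual churn out of the second approach (a sketch, with placeholder paths; run elevated): with host-header bindings, every site can share port 80, and both the hosts entry and the IIS site can be created in one scripted step via appcmd:

        @echo off
        rem Hypothetical helper: register a new dev site in one step.
        rem %1 = site name (e.g. ikea), %2 = physical path to the site root
        echo 127.0.0.1 localhost.%1>> %windir%\system32\drivers\etc\hosts
        %windir%\system32\inetsrv\appcmd add site /name:%1 /bindings:http/*:80:localhost.%1 /physicalPath:%2
        ipconfig /flushdns

    Because the host header (not the port) distinguishes the sites, there is no port bookkeeping, and the flush at the end handles the stale-DNS-cache annoyance in the same step.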

    Read the article

  • Running code/script as a result of a form submission in ASP.NET

    - by firmbeliever
    An outside vendor did some HTML work for us, and I'm filling in the actual functionality. I have an issue that I need help with. He created a simple HTML page that is opened as a modal pop-up. It contains a form with a few input fields and a submit button. On submitting, an email should be sent using info from the input fields. I turned his simple HTML page into a simple .aspx page, added runat="server" to the form, and added the C# code inside script tags to create and send the email. It technically works, but has a big issue: after the information is submitted and the email is sent, the page (which is supposed to just be a modal pop-up) gets reloaded, but it is now no longer a pop-up - it's reloaded as a standalone page. So I'm trying to find out if there is a way to get the form to just execute those few lines of C# code on submission without reloading the form. I'm somewhat aware of CGI scripts, but from what I've read, they can be buggy with IIS and all. Plus I'd like to think I could get these few lines of code to run without creating a separate executable. Any help is greatly appreciated.
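    One option that avoids a full postback (a sketch using an ASP.NET AJAX page method; all names and addresses are illustrative) is to make the send-mail code a static method on the page and call it from JavaScript:

        // In the pop-up page's code-behind: callable from script without reloading
        [System.Web.Services.WebMethod]
        public static string SendMail(string name, string email, string message)
        {
            var mail = new System.Net.Mail.MailMessage(
                "site@example.com", "sales@example.com",
                "Contact form submission", name + " <" + email + ">\n\n" + message);
            new System.Net.Mail.SmtpClient().Send(mail);  // SMTP host comes from <system.net/mailSettings>
            return "ok";
        }

    The markup then needs `<asp:ScriptManager EnablePageMethods="true" runat="server" />`, and the submit button's client-side handler calls `PageMethods.SendMail(name, email, msg, onDone); return false;` so the pop-up never navigates.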

    Read the article

  • Create a new webpage using a WCF service

    - by sweeney
    Hello, I'd like to create a WCF operation contract which consumes a string, produces a public webpage, and then returns the address of this new page:

        [ServiceContract(Namespace = "")]
        public class DatabaseService
        {
            [OperationContract]
            public string BuildPage(string fileName, string html)
            {
                // writes html to file
                // returns public file url
            }
        }

    This doesn't seem like it should be complicated, but I can't figure out how to do it. So far what I've tried is this:

        [OperationContract]
        public string PrintToFile(string name, string text)
        {
            FileInfo f = new FileInfo(name);
            StreamWriter w = f.AppendText();
            w.Write(text);
            w.Close();
            return f.Directory.ToString();
        }

    Here's the problem: this does not create a file in the web root; it creates it in the directory where the WebDAV server is running. When I run this on an IIS server it seems to do nothing at all (at least nothing that I can tell). How can I get a handle to the webroot programmatically, so that I can place the resulting file there and then return the public URL? I can tack on the domain name after the fact without issue, so if it only returns the relative path to the file, that's fine. Thanks, Brian
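    One way to resolve the web root from inside the service (a sketch that assumes the service is IIS-hosted and that ASP.NET compatibility mode is acceptable):

        using System.IO;
        using System.ServiceModel;
        using System.ServiceModel.Activation;
        using System.Web;
        using System.Web.Hosting;

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        [ServiceContract(Namespace = "")]
        public class DatabaseService
        {
            [OperationContract]
            public string BuildPage(string fileName, string html)
            {
                // HostingEnvironment.MapPath resolves "~/" to the site's physical
                // root; it requires <serviceHostingEnvironment
                // aspNetCompatibilityEnabled="true"/> in web.config
                string physicalPath = Path.Combine(HostingEnvironment.MapPath("~/"), fileName);
                File.WriteAllText(physicalPath, html);

                // Return the site-relative URL; the caller can prepend the domain
                return VirtualPathUtility.ToAbsolute("~/" + fileName);
            }
        }

    Note that the app pool identity needs write permission on the web root for this to work, which is worth weighing before letting a public-facing service write arbitrary HTML there.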

    Read the article
