Search Results

Search found 4914 results on 197 pages for 'iis 7 5'.

Page 187/197 | < Previous Page | 183 184 185 186 187 188 189 190 191 192 193 194  | Next Page >

  • Castle Windsor security exception

    - by Sunil
    I developed a small WCF service that uses the Castle Windsor IoC container, and it works fine on my PC. When I deploy it onto a Windows 2008 R2 server and host the WCF service in IIS 7, it fails with the error below. I checked the server-level web.config and the trust level is set to "Full". What do I need to do to get this to work? As a test I deployed the same service, as is, onto a Windows 2003 server with the trust level set to "Full", and it works fine. I am unable to figure out what setting/configuration I am missing on the 2008 server that is making the service fail.

    Stack trace:

        [SecurityException: That assembly does not allow partially trusted callers.]
           Castle.Windsor.WindsorContainer..ctor() +0
           WMS.ServiceContractImplementation.IoC.IoCInstanceProvider..ctor(Type serviceType) in D:\WCF\WCFProofOfConcept\WMSServices\WMS.ServiceContractImplementation\IoC\IoCInstanceProvider.cs:19
           WMS.ServiceContractImplementation.IoC.IoCServiceBehavior.ApplyDispatchBehavior(ServiceDescription serviceDescription, ServiceHostBase serviceHostBase) in D:\WCF\WCFProofOfConcept\WMSServices\WMS.ServiceContractImplementation\IoC\IoCServiceBehavior.cs:24
           System.ServiceModel.Description.DispatcherBuilder.InitializeServiceHost(ServiceDescription description, ServiceHostBase serviceHost) +377
           System.ServiceModel.ServiceHostBase.InitializeRuntime() +37
           System.ServiceModel.ServiceHostBase.OnBeginOpen() +27
           System.ServiceModel.ServiceHostBase.OnOpen(TimeSpan timeout) +49
           System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout) +261
           System.ServiceModel.HostingManager.ActivateService(String normalizedVirtualPath) +121
           System.ServiceModel.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath) +479
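    A quick way to confirm the trust level the application actually runs under on the 2008 R2 box (as opposed to what the config files say) is to demand AspNetHostingPermission from inside the service. A minimal diagnostic sketch, assuming it is dropped into the service project (the class name is illustrative):

        using System.Security;
        using System.Web;

        public static class TrustLevelProbe
        {
            // Walks down from Unrestricted and returns the highest permission
            // level actually granted to the current app domain.
            public static AspNetHostingPermissionLevel GetCurrentTrustLevel()
            {
                AspNetHostingPermissionLevel[] levels = new AspNetHostingPermissionLevel[]
                {
                    AspNetHostingPermissionLevel.Unrestricted,
                    AspNetHostingPermissionLevel.High,
                    AspNetHostingPermissionLevel.Medium,
                    AspNetHostingPermissionLevel.Low,
                    AspNetHostingPermissionLevel.Minimal
                };

                foreach (AspNetHostingPermissionLevel level in levels)
                {
                    try
                    {
                        new AspNetHostingPermission(level).Demand();
                        return level;
                    }
                    catch (SecurityException)
                    {
                        // Not granted at this level; try the next one down.
                    }
                }
                return AspNetHostingPermissionLevel.None;
            }
        }

    If this reports anything below Unrestricted on the 2008 server, the partially-trusted-caller error from Castle.Windsor follows directly, and a <trust> override somewhere in the configuration hierarchy is the thing to hunt for.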

    Read the article

  • Visual Studio 2010 / ASP.NET MVC 2 / Publish

    - by SevenCentral
    I just did a clean install on Windows 7 x64 Professional with the final release of Visual Studio 2010 Premium. In order to duplicate what I'm experiencing, do the following:

    1) Create a new ASP.NET MVC 2 Web Application
    2) Right-click the project and select Properties
    3) On the Web tab, select "Use Local IIS Web Server"
    4) Click Create Virtual Directory
    5) Save all
    6) Unload the project
    7) Edit the project file
    8) Change MvcBuildViews to true
    9) Save all
    10) Reload the project
    11) Right-click the project and select Publish
    12) Choose the file system publish method
    13) Enter a target location
    14) Choose "Delete all existing files"
    15) Select Publish
    16) Right-click the project and select Publish again

    Each time I do the above I get the following error: "It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level..." The error originates from obj\debug\package\packagetmp\web.config, relative to the project directory. I can repeat this all day long with any MVC 2 project I've built. In order to fix this problem, I need to set MvcBuildViews to false in the project file; that's not really an option. This wasn't a problem in Visual Studio 2008, and it seems to be an issue with the way the Publish command stages files beneath the project directory. Can anyone else duplicate this error? Is this a bug or by design? Is there a fix, workaround, etc.? Thanks.

    Read the article

  • web service slowdown

    - by user238591
    Hi, I have a web service slowdown. My (web) service is written in gSOAP and managed C++. It's not hosted in IIS or Apache, but it speaks XML. My client is in .NET. The service computation time is light (under 0.1 s to prepare a reply), and I expect the service to be smooth, fast, and highly available. I have about 100 clients, a response time of 1 s is mandatory, and clients make about 1 request per minute. Clients check for the web service's presence with a TCP open-port test, so to avoid possible congestion I turned gSOAP KeepAlive to false. Up to that point everything ran fine: I barely saw any connections in TCPView (Sysinternals). A new synchronisation program now calls the service in a loop. It's a higher load, but everything is processed in under 30 seconds. With Sysinternals TCPView, I see that about a thousand connections are in TIME_WAIT. They slow down the service, and it now takes seconds for the service to reply. Could it be that I need to reset the SoapHttpClientProtocol connection? Has anyone else seen TIME_WAIT ghosts with a web service called in a loop?
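    On the .NET side, one thing worth experimenting with is connection reuse on the generated proxy: when every call opens and tears down its own TCP connection, a tight calling loop will pile up sockets in TIME_WAIT. A hedged sketch, assuming a proxy class generated by wsdl.exe / Add Web Reference (MyServiceProxy is a placeholder name):

        using System;
        using System.Net;

        // MyServiceProxy is a placeholder for the generated SoapHttpClientProtocol proxy.
        public class ReusableServiceProxy : MyServiceProxy
        {
            protected override WebRequest GetWebRequest(Uri uri)
            {
                HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(uri);

                // Reuse the underlying TCP connection between calls instead of
                // opening and closing one per request; note that the gSOAP side
                // must also be willing to keep the connection open for this to help.
                request.KeepAlive = true;
                request.ServicePoint.ConnectionLimit = 12;
                return request;
            }
        }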

    Read the article

  • redirecting the root domain - SEO and other issues, need some guidance!

    - by Jim Sp
    I'm not familiar with some of these forwarding methods and I need help. My issue is this: I have a site hosted on discountasp.net. My domain was registered through 1&1, and I pointed the DNS at what discountasp.net wanted, so when a user types www.mydomain.com, he/she sees the ASP.NET site hosted on discountasp.net, which is all fine. My main page is Index.aspx. I really suck at HTML page design and I don't have the time or the talent to fiddle with it (or money to get it done by a pro); the rest of the pages are fine. I want to use a good theme from Tumblr or Blogger, one of the blog sites, and create a page that I want to use as the first page, directly on Blogger or Tumblr, say yyy.blogspot.com (I have many reasons, so for now please don't bash my decision; let's just say that's what I want). That means when a user types www.mydomain.com, it should redirect to the Blogger or Tumblr page. Everything else stays the same: the links on the blog page will say www.mydomain.com/xxxx and show what's on the hosted website. I have set up the IIS rewrite rules etc. so that all works just fine. The bottom line is that I want to show an external site's web page as my root page; I suppose I'm struggling to even explain what I want! I can of course do a Response.Redirect on the Index.aspx page, which is the simplest way to manage this, but the big question is: will this hurt SEO in some way? If not, that is what I would do, leaving the rest of the infrastructure intact (I have already done this as a test and it works fine). Thank you very much, j
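    If the Response.Redirect route is taken, the SEO-relevant detail is the status code: Response.Redirect issues a 302 (temporary) redirect by default, whereas a permanent move is normally signalled with a 301 so search engines transfer the root domain's ranking to the new page. A minimal sketch for the Index.aspx code-behind, with the blog URL as a placeholder:

        protected void Page_Load(object sender, EventArgs e)
        {
            // Permanent (301) redirect instead of the default 302 from Response.Redirect.
            Response.StatusCode = 301;
            Response.Status = "301 Moved Permanently";
            Response.AddHeader("Location", "http://yyy.blogspot.com/");
            Response.End();
        }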

    Read the article

  • About Interview structure for test automation lab developers

    - by Ikaso
    Hi, I am interviewing new applicants for a team that is doing test automation on our company's product(s). The team is composed of junior software developers and a team leader. The product runs on Windows and has both managed and unmanaged parts. The test automation is done on both the client side (user mode and kernel mode) and the server side (IIS, Windows services, backend). We are doing mainly integration tests and black-box tests. I am trying to figure out how to organize my interview. My overall idea is to ask about a project they have done, then ask some technical questions (multithreading, GC, design patterns) and one programming question. Please note that there is another interview done before mine with 2 programming questions. My programming question is rather simple (for example: reversing a singly-linked list, sketched below). My coworkers think that my questions will not find good developers since they are rather simple and well known, but so far most of the applicants fail them. My questions are: Should I change the structure of my interview for this kind of job? What questions do you ask to figure out whether an applicant is test oriented? (Maybe I should provide a buggy implementation of a problem, let them find the bugs, and then ask what tests they would have written.) Regards,
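    For reference, the kind of answer the linked-list screening question is looking for is short; a minimal iterative sketch in C# (a generic Node type is assumed, since the question does not specify one):

        public class Node<T>
        {
            public T Value;
            public Node<T> Next;
        }

        public static class LinkedListUtil
        {
            public static Node<T> Reverse<T>(Node<T> head)
            {
                Node<T> previous = null;
                while (head != null)
                {
                    Node<T> next = head.Next;   // remember the rest of the list
                    head.Next = previous;       // point the current node backwards
                    previous = head;            // advance the new head
                    head = next;
                }
                return previous;
            }
        }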

    Read the article

  • Forwarding HTTP Request with Direct Server Return

    - by Daniel Crabtree
    I have servers spread across several data centers, each storing different files. I want users to be able to access the files on all servers through a single domain and have the individual servers return the files directly to the users. The following shows a simple example:

    1) The user's browser requests http://www.example.com/files/file1.zip
    2) Request goes to server A, based on the DNS A record for example.com.
    3) Server A analyzes the request and works out that /files/file1.zip is stored on server B.
    4) Server A forwards the request to server B.
    5) Server B returns file1.zip directly to the user without going through server A.

    Note: steps 4 and 5 must be transparent to the user and cannot involve sending a redirect to the user as that would violate the requirement of a single domain. From my research, what I want to achieve is called "Direct Server Return" and it is a common setup for load balancing. It is also sometimes called a half reverse proxy. For step 4, it sounds like I need to do MAC Address Translation and then pass the request back onto the network, and for servers outside the network of server A tunneling will be required. For step 5, I simply need to configure server B, as per the real servers in a load balancing setup. Namely, server B should have server A's IP address on the loopback interface and it should not answer any ARP requests for that IP address. My problem is how to actually achieve step 4. I have found plenty of hardware and software that can do this for simple load balancing at layer 4, but these solutions fall short and cannot handle the kind of custom routing I require. It seems like I will need to roll my own solution. Ideally, I would like to do the routing / forwarding at the web server level, i.e. in PHP or C# / ASP.net. However, I am open to doing it at a lower level such as Apache or IIS, or at an even lower level, i.e. a custom proxy service in front of everything.

    Read the article

  • Use of RegularExpressionValidator in MS .Net Framework

    - by Lawk Salih
    Hi, I am trying to use the RegularExpressionValidator control in Visual Studio to validate a textbox for email addresses. Here's my code (very basic):

        Email:&nbsp; <input id="Text1" type="text" /><br />
        <br />
        <input id="Validate" type="button" value="Validate" runat="server" /><br />
        <br />
        <asp:RegularExpressionValidator ID="RegularExpressionValidator1" runat="server"
            ErrorMessage="RegularExpressionValidator" ControlToValidate="Validate"
            ValidationExpression="\w+([-+.']\w+)*@\w+([-.]\w+)*\.\w+([-.]\w+)*"></asp:RegularExpressionValidator>

    Now, when I set ControlToValidate to Validate, I get the following error message:

        Server Error in '/' Application.
        Control 'Validate' referenced by the ControlToValidate property of 'RegularExpressionValidator1' cannot be validated.
        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
        Exception Details: System.Web.HttpException: Control 'Validate' referenced by the ControlToValidate property of 'RegularExpressionValidator1' cannot be validated.
        Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

        Stack Trace:
        [HttpException (0x80004005): Control 'Validate' referenced by the ControlToValidate property of 'RegularExpressionValidator1' cannot be validated.]
           System.Web.UI.WebControls.BaseValidator.CheckControlValidationProperty(String name, String propertyName) +8734357
           System.Web.UI.WebControls.BaseValidator.ControlPropertiesValid() +40
           System.Web.UI.WebControls.BaseValidator.get_PropertiesValid() +21
           System.Web.UI.WebControls.BaseValidator.OnPreRender(EventArgs e) +27
           System.Web.UI.Control.PreRenderRecursiveInternal() +80
           System.Web.UI.Control.PreRenderRecursiveInternal() +171
           System.Web.UI.Control.PreRenderRecursiveInternal() +171
           System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +842

        Version Information: Microsoft .NET Framework Version:2.0.50727.3607; ASP.NET Version:2.0.50727.3082

    I have tried the same mechanism before and it worked; however, this time I am on a new machine, Windows XP Pro running IIS 5.1. Any direction would be appreciated.
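    For context, the exception is raised because ControlToValidate points at the button rather than at the textbox, and the textbox as written is a plain HTML input without runat="server", so the validator has no server control with a validatable property to attach to. A hedged sketch of corrected markup (IDs kept from the question, error text illustrative):

        Email:&nbsp;
        <asp:TextBox ID="Text1" runat="server" /><br />
        <br />
        <asp:Button ID="Validate" runat="server" Text="Validate" /><br />
        <br />
        <asp:RegularExpressionValidator ID="RegularExpressionValidator1" runat="server"
            ControlToValidate="Text1"
            ErrorMessage="Please enter a valid email address"
            ValidationExpression="\w+([-+.']\w+)*@\w+([-.]\w+)*\.\w+([-.]\w+)*" />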

    Read the article

  • Does .Net use Device Dependent or Device Independent Bitmaps?

    - by Brian
    When loading an image into memory, does .NET use DDBs, DIBs, or something else entirely? If possible, please cite your sources. I'm asking because we currently have a classic ASP application that uses a 3rd-party component to load images, and it occasionally throws a "Not enough storage is available to process this command." error. The error is very inconsistent but tends to happen on larger images (not always, but often). After resetting IIS, processing the same file again typically works just fine. After much research I have found that DDBs tend to have this problem when processing large images because they work out of video memory. Considering that we are running on a web server with an integrated video card and limited shared memory, this could certainly be our problem. We are in the early stages of converting our app to .NET, and I am wondering whether using .NET for this might be a viable alternative to our current method, which is why I am asking. Any advice is welcome :) but, out of curiosity if nothing else, I am really hoping for an answer to the question: does .NET use DDBs or DIBs?
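    For what it's worth, System.Drawing in .NET is GDI+, and GDI+ Bitmap objects hold their pixel data in regular process memory rather than as device-dependent bitmaps in video memory; a DDB only enters the picture if you explicitly ask for one via interop (for example Bitmap.GetHbitmap()). A minimal sketch of a load-and-thumbnail done entirely through GDI+, with prompt disposal since large images still consume a lot of address space:

        using System.Drawing;

        public static class ImageHelper
        {
            public static void SaveThumbnail(string sourcePath, string thumbPath, int width, int height)
            {
                // Bitmap and Graphics here are GDI+ objects backed by process memory,
                // not device-dependent bitmaps tied to the video card.
                using (Bitmap source = new Bitmap(sourcePath))
                using (Bitmap thumb = new Bitmap(width, height))
                using (Graphics g = Graphics.FromImage(thumb))
                {
                    g.DrawImage(source, 0, 0, width, height);
                    thumb.Save(thumbPath);
                }
            }
        }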

    Read the article

  • Database cache that I'm not aware of?

    - by Martin
    I'm using ASP.NET MVC, LINQ to SQL, IIS 7 and SQL Server 2008 Express. I get intermittent server errors: primary key conflicts on insertion. I'm using a different setup on my development computer so I can't debug. After a while they go away, and restarting IIS helps. I'm getting the feeling there is a cache somewhere that I'm not aware of. Can somebody help me sort out these errors?

        Cannot insert duplicate key row in object 'dbo.EnquiryType' with unique index 'IX_EnquiryType'.

    Edits regarding Venemo's answer:

    Is it possible that another application is also accessing the same database simultaneously? Yes there is, but not this particular table, and with no inserts or updates. There is one other table with which I experience the same problem, but it has to do with a different part of the model.
    How often and in what context do you create a new DataContext instance? Only once, using the singleton pattern.
    Are the primary keys generated by the database or by the application? By the database.
    Which version of ASP.NET MVC and which version of .NET are you using? RC2 and 3.5.
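    One thing worth comparing against the current setup: because a LINQ to SQL DataContext tracks every object it has ever materialized, sharing a single instance across all requests (the singleton mentioned above) is a common source of exactly this kind of stale, cache-like behaviour and of surprises when concurrent requests insert at the same time. The usual recommendation is a short-lived DataContext per request or per unit of work; a hedged sketch (MyDataContext and the EnquiryTypes table are placeholders for the generated context):

        public ActionResult Create(EnquiryType enquiryType)
        {
            // A fresh, short-lived context per unit of work instead of a shared singleton;
            // its identity map and change tracker are discarded as soon as the work is done.
            using (MyDataContext db = new MyDataContext())
            {
                db.EnquiryTypes.InsertOnSubmit(enquiryType);
                db.SubmitChanges();
            }
            return RedirectToAction("Index");
        }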

    Read the article

  • WCF newbie - how to install and use a SSL certificate?

    - by Shaul
    This should be a snap for anyone who's done it before... I'm trying to set up a self-hosted WCF service using NetTcpBinding. I got a trial SSL certificate from Thawte and successfully installed it in my IIS store, and I think I've got it correctly set up in the service; at least it doesn't throw an exception at me! Now I'm trying to connect the client (this is still all on my dev machine), and it's giving me an error:

        The X.509 certificate CN=ssl.mydomain.com, OU=For Test Purposes Only. No assurances., OU=IT, O=My Company, L=My Town, S=None, C=IL chain building failed. The certificate that was used has a trust chain that cannot be verified. Replace the certificate or change the certificateValidationMode. A certificate chain processed, but terminated in a root certificate which is not trusted by the trust provider.

    Ooookeeeey... now what? Client code (I want to do this in code, not app.config):

        var baseAddress = "localhost";
        var factory = new DuplexChannelFactory<IMyWCFService>(new InstanceContext(SiteServer.Instance));
        factory.Endpoint.Address = new EndpointAddress("net.tcp://{0}:8000/".Fmt(baseAddress));
        var binding = new NetTcpBinding(SecurityMode.Message);
        binding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;
        factory.Endpoint.Binding = binding;
        var u = factory.Credentials.UserName;
        u.UserName = userName;
        u.Password = password;
        return factory.CreateChannel();
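    The error message itself points at the two available levers: either get the Thawte test root CA trusted on the client machine (import it into Trusted Root Certification Authorities), or tell the client how to validate this particular certificate. A hedged sketch of the client-side knob, placed before the CreateChannel call; the first option weakens validation and is for development only:

        // Development only: skip chain validation entirely.
        factory.Credentials.ServiceCertificate.Authentication.CertificateValidationMode =
            System.ServiceModel.Security.X509CertificateValidationMode.None;

        // Alternative: trust this specific certificate by importing it into the client's
        // "Trusted People" store and validating against that store instead.
        // factory.Credentials.ServiceCertificate.Authentication.CertificateValidationMode =
        //     System.ServiceModel.Security.X509CertificateValidationMode.PeerTrust;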

    Read the article

  • Deploying DotNetNuke and separate ASP.NET Application together - Possible Issues?

    - by TheTXI
    I am posting this as a proactive attempt to head off any potential problems before they arise. The situation is that we are developing an ASP.NET application for a client which will handle online ordering from their customers. This application is going to use the same database that their current WinForms application uses (no real issue here). At the same time we are developing a new front-end website for them using DotNetNuke. The DotNetNuke app will simply link to the ASP.NET application for the customers to submit their orders (no need for the two to communicate back and forth, etc.). The plan is to host both applications on the same box at the client location. What I am looking for are potential problems or setup tips which would prevent conflicts between the two apps (web.config conflicts, etc.). Is there a problem with having both hosted in the same location, how should IIS be set up, etc.? If there are any external resources available which address this, please feel free to link them as well.

    Read the article

  • Controller actions appear to be synchronous though on different requests?

    - by Oded
    I am under the impression that the code below should work asynchronously. However, when I look at Firebug, I see the requests fired asynchronously but the results coming back synchronously.

    Controller:

        [HandleError]
        public class HomeController : Controller
        {
            public ActionResult Status()
            {
                return Content(Session["status"].ToString());
            }

            public ActionResult CreateSite()
            {
                Session["status"] += "Starting new site creation";
                Thread.Sleep(20000); // Simulate long running task
                Session["status"] += "<br />New site creation complete";
                return Content(string.Empty);
            }
        }

    JavaScript/jQuery:

        $(document).ready(function () {
            $.ajax({
                url: '/home/CreateSite',
                async: true,
                success: function () { mynamespace.done = true; }
            });
            setTimeout(mynamespace.getStatus, 2000);
        });

        var mynamespace = {
            counter: 0,
            done: false,
            getStatus: function () {
                $('#console').append('.');
                if (mynamespace.counter == 4) {
                    mynamespace.counter = 0;
                    $.ajax({
                        url: '/home/Status',
                        success: function (data) { $('#console').html(data); }
                    });
                }
                if (!mynamespace.done) {
                    mynamespace.counter++;
                    setTimeout(mynamespace.getStatus, 500);
                }
            }
        }

    Additional information: IIS 7.0, Windows 2008 R2 Server, running in a VMware virtual machine. Can anyone explain this? Shouldn't the Status action return practically immediately instead of waiting for CreateSite to finish? Edit: How can I get the long-running process to kick off and still get status updates?
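    For what it's worth, the serialization itself is expected here: ASP.NET takes an exclusive lock on session state for the duration of any request with read/write access to Session, so the Status request queues up behind CreateSite for the same user. A hedged sketch of the edit at the end, keeping progress out of Session so polling is not blocked (assumes System.Collections.Generic and System.Threading are imported; the static dictionary is an illustrative stand-in for a proper store):

        private static readonly Dictionary<Guid, string> JobStatus = new Dictionary<Guid, string>();
        private static readonly object JobStatusLock = new object();

        public ActionResult CreateSite()
        {
            Guid jobId = Guid.NewGuid();
            ThreadPool.QueueUserWorkItem(delegate
            {
                lock (JobStatusLock) { JobStatus[jobId] = "Starting new site creation"; }
                Thread.Sleep(20000); // simulate the long-running task
                lock (JobStatusLock) { JobStatus[jobId] = "New site creation complete"; }
            });
            // Return immediately with a token the client can poll with.
            return Content(jobId.ToString());
        }

        public ActionResult Status(Guid id)
        {
            string status;
            lock (JobStatusLock)
            {
                JobStatus.TryGetValue(id, out status);
            }
            return Content(status ?? string.Empty);
        }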

    Read the article

  • VS ASP.NET 500 Server Error

    - by rlb.usa
    Hey guys, I'm having a super weird problem with my VS 2008 solution. We had a hand-coded, compiled ASP.NET web app on our old IIS 6 / Windows 2003 server, working great. We moved it to our new IIS 7 / Windows 2008 server and it still works great, but when I compile the application and publish it again to the new Windows 2008 server, I get server 500 errors. It's ASP.NET 2.0 with the AJAX extensions and the AJAX Control Toolkit. I'm not too great with server issues, or even sure this is a server issue, but here are some more symptoms. I know the website works (it only differs by some minor code fixes) and I can use its code on a development machine; there are no errors, and it publishes fine. Publishing (using the DLL files), and even not publishing and using the code-behind files directly on the new server, both fail. The old website works on the new server just fine. If I put a simple hello-world HTML page in the website's virtual directory alongside the old code, it works fine, but with the new code that HTML page gets the 500 error. In fact, oddly, I can add all the files to the website, and only when I add the web.config do I get the 500 error. The web.config has not changed. I have tried stopping and restarting IIS. What's the problem here? Any ideas what else I can do to troubleshoot it?

    Read the article

  • Request header field x-user-session is not allowed by Access-Control-Allow-Headers

    - by Saurabh Bhandari
    I am trying to make a CORS call to a WCF service endpoint hosted on IIS 7.5. I have configured custom headers in IIS; my configuration looks like this:

        <customHeaders>
          <add name="Access-Control-Allow-Methods" value="GET,PUT,POST,DELETE,OPTIONS" />
          <add name="Access-Control-Allow-Headers" value="x-user-session,origin, content-type, accept" />
          <add name="Access-Control-Allow-Credentials" value="true" />
        </customHeaders>

    When I do a POST request I get the following error message: "Request header field x-user-session is not allowed by Access-Control-Allow-Headers". If I remove my custom header from the call and run it, everything works fine. Also, if I do a GET call with the custom header, the API works correctly.

        $.ajax({
            type: "POST",
            success: function (d) { console.log(d); },
            timeout: 9000,
            url: "http://api.myserver.com/Services/v2/CreditCard.svc/update_cc_detail",
            data: JSON.stringify({ "card_id": 1234, "expire_month": "11", "expire_year": "2020", "full_name": "Demo Account", "number": "4111111111111111", "is_primary": true }),
            xhrFields: { withCredentials: true },
            headers: {
                'x-user-session': "B23680D0B8CB5AFED9F624271F1DFAE5052085755AEDDEFDA3834EF16115BCDDC6319BD79FDCCB1E199BB6CC4D0C6FBC9F30242A723BA9C0DFB8BCA3F31F4C7302B1A37EE0A20C42E8AFD45FAB85282FCB62C0B4EC62329BD8573FEBAEBC6E8269FFBF57C7D57E6EF880E396F266E7AD841797792619AD3F1C27A5AE"
            },
            crossDomain: true,
            contentType: 'application/json'
        });
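    One thing worth checking beyond the header list: the browser only sends the custom header after a preflight OPTIONS request, and a WCF .svc endpoint will typically answer OPTIONS with an error rather than a 200; if the preflight does not come back as a success carrying the Access-Control-Allow-* headers, the browser refuses to send the POST. A hedged sketch of short-circuiting the preflight in Global.asax.cs so the IIS custom headers above come back with a 200 (assumes the integrated pipeline):

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Answer CORS preflight requests directly instead of passing the
            // OPTIONS verb on to the WCF handler.
            if (Request.HttpMethod == "OPTIONS")
            {
                Response.StatusCode = 200;
                Response.End();
            }
        }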

    Read the article

  • NHibernate will not insert a record

    - by Brian Beckham
    I have an application that is now 4+ years old that is exhibiting some odd behavior on our latest deployment. The application uses nHibernate for all inserts / updates / selects, etc. We are currently using .NET 2.0, and nHibernate 1.2 (I know, we need to upgrade) This deployment is on Windows 2008 Server x64, IIS 7.5 - what I have seen so far is that the application runs, but is unable to insert or update records in the DB - reads seem fine so far, but writes are a problem. SOME writes actually work, inserts into some small tables, but most never even make it to the DB. Using SQL Profiler, the insert / updates never make it to the server, and turning log4net up to DEBUG, and show_sql true - the select statements appear, but the insert / update statements never make it into the log at all, and never show up at the server. What's even more odd is that the application seems to be oblivious to this - the commandandclose runs without exception (open session in view with an httpmodule), the domain objects come back with uuid's generated, etc. but never get persisted. Certainly an upgrade is due, but I would hate to try it during a deployment, and without time to accurately test the app. Any ideas?
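    One detail that matches these symptoms: NHibernate only issues the INSERT/UPDATE statements when the session is flushed, and with a uuid-style generator the id is assigned in memory at Save() time without touching the database, so objects can come back looking persisted while nothing was ever sent to SQL Server. The usual guidance is to wrap writes in an explicit transaction rather than relying on a flush at session close; a hedged sketch (how the current session is obtained from the HttpModule is a placeholder):

        using NHibernate;

        public static class Repository
        {
            public static void Save(ISession session, object entity)
            {
                // Save() assigns the uuid immediately, but the INSERT is only
                // sent to the database when the session is flushed; committing
                // an explicit transaction forces that flush.
                using (ITransaction tx = session.BeginTransaction())
                {
                    session.Save(entity);
                    tx.Commit();
                }
            }
        }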

    Read the article

  • Providing downloads on ASP.net website

    - by Dave
    I need to provide downloads of large files (upwards of 2 GB) on an ASP.NET website. It has been some time since I've done something like this (I've been in the thick-client world for a while now), and I was wondering about current best practices. Ideally, I would like:

    - To be able to track download statistics: the number of downloads is essential; actual bytes sent would be nice.
    - To provide downloads in a way that "plays nice" with third-party download managers. Many of our users have unreliable internet connections, and being able to resume a download is a must.
    - To allow multiple users to download the same file simultaneously.

    My download files are not security-sensitive, so providing a direct link ("right-click to download...") is a possibility. Is just providing a direct link sufficient, letting IIS handle it, and then using some log analyzer service (any recommendations?) to compile and report the statistics? Or do I need to intercept the download request, store some info in a database, then send a custom Response? Or is there an ASP.NET user control (built-in or third-party) that does this? I appreciate all suggestions.
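    If the interception route is chosen, the minimal shape is an HTTP handler that records the hit and then lets IIS stream the file from disk; note that a plain handler like this does not honour Range requests, so resumable downloads would still need extra work, which is one argument for the direct-link-plus-log-analysis option. A hedged sketch (the logging call and downloads folder are placeholders):

        using System.IO;
        using System.Web;

        public class DownloadHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                // Only the file name is honoured, to keep requests inside the downloads folder.
                string fileName = Path.GetFileName(context.Request.QueryString["file"]);
                string path = context.Server.MapPath("~/Downloads/" + fileName);

                // Record the download here (e.g. write a row to a statistics table).

                FileInfo info = new FileInfo(path);
                context.Response.ContentType = "application/octet-stream";
                context.Response.AddHeader("Content-Disposition", "attachment; filename=" + fileName);
                context.Response.AddHeader("Content-Length", info.Length.ToString());
                context.Response.TransmitFile(path); // streams from disk without buffering the whole file in memory
            }
        }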

    Read the article

  • How to set up a load/stress test for a web site?

    - by Ryan
    I've been tasked, out of the blue, with stress/load testing our company web site, and I know nothing about doing so. Every search I make on Google for "how to load test a web site" just comes back with companies and software that physically do the load testing. For now I'm more interested in how to actually go about setting up a load test: what I should take into account before load testing, which pages within my site I should be testing against, and what I'm going to want to monitor during the test. Our web site is a multi-tier system with a separate database server (IIS 7 web server, SQL Server 2000 database). I imagine I'd want to monitor both the web server and the database server; however, when setting up scenarios to load test the web server, I'd have to use pages that query the database to put any load on the database server at the same time. Are web servers and database servers generally tested simultaneously, or as separate tests? As you can see I'm pretty clueless about the whole operation, so any insight on how to go about this would be very helpful. FYI, I have been tinkering with Pylot and was able to create and run a scenario against our site, but I'm not sure what I should be looking for in the results, or whether the scenario I created is even worth measuring for our site. Thanks in advance.

    Read the article

  • C#: Populating a UI using separate threads.

    - by Andrew
    I'm trying to make some sense of an application I've been handed in order to track down the source of an error. There's a bit of code (simplified here) which creates four threads which in turn populate list views on the main form. Each method gets data from the database and retrieves graphics from a resource DLL in order to directly populate an ImageList and ListView. From what I've read on here (link), updating UI elements from any thread other than the UI thread should not be done, and yet this appears to work?

        Thread t0 = new Thread(new ThreadStart(PopulateListView1));
        t0.IsBackground = true;
        t0.Start();
        Thread t1 = new Thread(new ThreadStart(PopulateListView2));
        t1.Start();
        Thread t2 = new Thread(new ThreadStart(PopulateListView3));
        t2.Start();
        Thread t3 = new Thread(new ThreadStart(PopulateListView4));
        t3.Start();

    The error itself is a System.InvalidOperationException, "Image cannot be added to the ImageList.", which has me wondering if the above code is linked to it in some way. Is this method of populating the UI recommended, and if not, what are the possible complications resulting from it?
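    For reference, the usual reason this pattern eventually blows up is that Windows Forms controls (the ListViews) must only be touched from the UI thread, and the GDI+ images fed to an ImageList are not thread-safe either; the conventional shape is to do the database and resource work on the background thread and marshal just the UI mutation back with Control.Invoke. A minimal sketch of one of the populate methods (LoadItemsFromDatabase is a placeholder for the real query):

        private void PopulateListView1()
        {
            // Heavy work (database query, reading graphics from the resource DLL)
            // stays on the background thread...
            List<ListViewItem> items = LoadItemsFromDatabase();

            // ...and only the UI update is marshalled back to the UI thread.
            listView1.Invoke((MethodInvoker)delegate
            {
                foreach (ListViewItem item in items)
                {
                    listView1.Items.Add(item);
                }
            });
        }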

    Read the article

  • Extend legacy site with another server-side programming platform best practice

    - by Andrew Florko
    The company I work for has a site developed 6-8 years ago by a team that was enthusiastic enough to use their own private PHP-based CMS. Within one week I have to put dynamic data from one intranet company database onto this site: 2-3 pages. I contacted the company site administrator and she showed me the administrative part; the CMS only allows inserting HTML blocks and managing the site map (the site is deployed on a machine that is inside the company and fully accessible and upgradeable). I'm not a PHP guy and I don't want to dive into a legacy CMS engine hardly anyone has ever heard of. I also don't want to contact the developer team, because I'm not sure they are still around and capable of extending this old site, and it would take too much time anyway. I am about to deploy a helper ASP.NET site on IIS with the 2-3 required pages and reference the helper site via an iframe from the present site. The new pages will also allow downloading some dynamic content from the present site. Is this OK, and what are the pitfalls of the iframe approach?

    Read the article

  • Re-authentication required for registered-path links (to ASP.NET site) coming to IE from PowerPoint

    - by Daniel Halsey
    We're using URL routing based on Phil Haack's example, with config modifications based on MSDN Library article #CC668202, to provide "shareable" links for an ASP.NET forms site, and we have run into a strange issue: for users attempting to open links from PowerPoint presentations who have IE set as their default browser, using one of these links forces (forms-based) re-authentication, even in the same browser instance with a live session. Some additional information:

    - We know the session is still alive (the page returns information for the currently logged-in user; confirmed via debug watches).
    - This doesn't happen with other browsers (Firefox, Chrome) or with other programs (Notepad++) as the URL source.
    - We do not have a default path set, as this caused issues with root path handling at initial login.
    - This primarily happens with PowerPoint, but will also happen in Word and OCS.
    - On some machines, even after changing the default browser, Office apps will continue to use IE for these links, forcing this error. (A potential registry fix for this failed, but even if it had worked, we can't control default browser choice for our users.)

    We can't figure out whether this is an Office oddity or is caused by our decision to use app-level URL routing (rather than IIS rewriting). Has anyone else encountered this and found a solution?

    Read the article

  • Silverlight 4, Google Chrome, and HttpWebRequest problem

    - by synergetic
    My Silverlight 4 application, hosted in ASP.NET MVC 2, works fine when used through Internet Explorer 8, both on the development server and on the remote web server (IIS 6.0). However, when I browse through Google Chrome (version 5.0.375.70), it throws a "remote server returned not found" error. The code causing the problem is the following:

        public class MyWebClient
        {
            private HttpWebRequest _request;
            private Uri _uri;
            private AsyncOperation _asyncOp;

            public MyWebClient(Uri uri)
            {
                _uri = uri;
            }

            public void Start(XElement data)
            {
                _asyncOp = AsyncOperationManager.CreateOperation(null);
                _data = data;
                _request = (HttpWebRequest)WebRequest.Create(_uri);
                _request.Method = "POST";
                _request.BeginGetRequestStream(new AsyncCallback(BeginRequest), null);
            }

            private void BeginRequest(IAsyncResult result)
            {
                Stream stream = _request.EndGetRequestStream(result);
                using (StreamWriter writer = new StreamWriter(stream))
                {
                    writer.Write(((XElement)_data).ToString());
                }
                stream.Close();
                _request.BeginGetResponse(new AsyncCallback(BeginResponse), null);
            }

            private void BeginResponse(IAsyncResult result)
            {
                HttpWebResponse response = (HttpWebResponse)_request.EndGetResponse(result);
                if (response != null)
                {
                    // process returned data ...
                }
            }
            ...
        }

    In short, the above code sends some XML data to the web server (to an ASP.NET MVC controller) and gets back processed data. It works when I use Internet Explorer 8. Can someone please explain what the problem is with Google Chrome?
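    One Silverlight-specific knob worth trying when behaviour differs between browsers: by default HttpWebRequest in Silverlight is serviced by the hosting browser's networking layer, and Silverlight 4 allows opting into its own client HTTP stack instead. A hedged sketch, registered once at startup before any request is made:

        using System.Net;
        using System.Net.Browser;
        using System.Windows;

        public partial class App : Application
        {
            public App()
            {
                // Route http:// traffic through Silverlight's client HTTP stack
                // rather than the browser's plug-in networking API.
                WebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp);

                InitializeComponent();
            }
        }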

    Read the article

  • MySQL + Joomla + remote c# access

    - by Jimmy
    Hello, I work on a Joomla web site, installed on a MySQL database and running on IIS7. It's all working fine. I now need to add functionality that lets (Joomla-)registered users change some configuration data. Though I haven't done this yet, it looks straightforward enough to do with Joomla. The data is private so all external access will be done through HTTPS. I also need an existing c# program, running on another machine, to read that configuration data. Sure enough, this data access needs to be as fast as possible. The data will be small (and filtered by query), but the latency should be kept to a minimum. A short-term, client-side cache (less than a minute, in case a user updates his configuration data) seems like a good idea. I have done practically zero database/asp programming so far, so what's the best way of doing that last step? Should the c# program access the database 'directly' (using what? LINQ?) or setup some sort of Facade (SOAP?) service? If a service should be used, should it be done through Joomla or with ASP on IIS? Thanks
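    For the direct-access option, the usual route from C# is the MySQL Connector/NET ADO.NET provider over a secured connection, optionally wrapped in the short client-side cache described above; a hedged sketch (connection string, table and column names are placeholders):

        using System.Collections.Generic;
        using MySql.Data.MySqlClient; // MySQL Connector/NET

        public static class ConfigReader
        {
            public static Dictionary<string, string> GetUserConfig(string connectionString, string userName)
            {
                Dictionary<string, string> result = new Dictionary<string, string>();
                using (MySqlConnection conn = new MySqlConnection(connectionString))
                using (MySqlCommand cmd = new MySqlCommand(
                    "SELECT setting_name, setting_value FROM user_config WHERE user_name = @user", conn))
                {
                    cmd.Parameters.AddWithValue("@user", userName);
                    conn.Open();
                    using (MySqlDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            result[reader.GetString(0)] = reader.GetString(1);
                        }
                    }
                }
                return result;
            }
        }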

    Read the article

  • firefox does not load large size images

    - by Pradeep
    I am stuck on a kind of bug in Firefox, where it's unable to load large images (I have an 8 MB image) from the server. The image loads fine in IE. I am still looking for ways to get rid of this problem. I changed server (IIS) settings to allow bigger file sizes. I also used the "load" event on the image with jQuery and tried all sorts of options listed at http://api.jquery.com/load-event/, but nothing has worked so far. If any of you have come across a similar problem, and a way to resolve it, it would be nice to hear from you. Please note: high-resolution images are part of the requirement. Code:

        <style>
            img {
                background-color: #FFFFFF;
                background-image: url(http://eremurus.hyd:8080/QMS/plugin/imagepanner/loader.gif);
                background-repeat: no-repeat;
                background-position: center center;
            }
        </style>
        <script src="../plugin/jquery-ui-1.8.7.custom/js/jquery-1.4.4.min.js" type="text/javascript"></script>
        <script>
            jQuery(document).ready(function($){
                ///var _url = "http://eremurus.hyd:8080/QMS/plugin/imagepanner/floorPlan.jpg";
                // set up the node / element
                _im = $("#main");
                //_im.bind("load",function(){ $(this).fadeIn(); });
                // set the src attribute now, after insertion to the DOM
                //_im.attr('src',_url);
                $("#main").one("load", function(){
                    alert('loaded');
                })
                .each(function(){
                    if(this.complete){
                        $(this).trigger("load");
                    }
                });
            });
        </script>
        </head>
        <body>
            <div id="target"><img id='main' src="http://eremurus.hyd:8080/QMS/plugin/imagepanner/floorPlan.jpg"></img></div>
        </body>
        </html>

    Read the article

  • Integrated Security on Reporting Services XML Datasource

    - by Nathan
    Hey all, I am working on setting up my report server to use a web service as an XML datasource. I seem to be having authentication issues between the web service and the report when I choose to use Integrated Security. Here's what I have:

    1) I have a website with an exposed service. This website is configured to run ONLY on Integrated Security: all other modes are turned off AND "Enable anonymous access" is turned off under directory security.
    2) Within the Web.config of the website, I have the authentication mode set to Windows.
    3) I have the report datasource set to an XML data source. I have the correct URL to the service and have it set to Windows Integrated Security.

    Since I am making a hop from the browser to the Reporting Server to the web service, I wonder if I am having an issue with Kerberos, but I am not sure. When I try to access the service, I get a 401 error. Here are the IIS logs that I am generating:

        2011-01-07 14:52:12 W3SVC IP_ADDY POST /URL.asmx - 80 - IP_ADDY - 401 1 0
        2011-01-07 14:52:12 W3SVC IP_ADDY POST /URL.asmx - 80 - IP_ADDY - 401 1 5

    Has anyone worked out this issue before? Thanks!

    Read the article

  • Looking for resources to explain a security risk.

    - by Dave
    I have a developer who has given users the ability to download a zip archive containing an HTML document that references a relative JavaScript file and a Flash document. The Flash document accepts as one of its parameters a URL that is embedded in the HTML document. I believe this archive is meant to be used as a means of transferring an advertisement to someone who would use the source to display the ad on their site; however, the end user appears to want to view it locally. When one opens the HTML document the Flash document is presented, and when the user clicks on the Flash document it redirects to this embedded URL. However, if one extracts the archive to the desktop, opens the HTML document in a browser, and clicks the Flash object, nothing observable happens: they are not redirected to the external URL. I believe this is a security risk because one is transferring from the local computer zone to an external zone. I'm trying to determine the best way to explain this security risk to the end user in the simplest of terms. They simply believe it's "broken" when it's not broken; they're being protected from a known vulnerability. The developer attempted to explain how to copy the files to a local IIS instance, which I highly doubt is running on the user's machine, and I do not consider that a viable explanation.

    Read the article
