Search Results

Search found 1108 results on 45 pages for 'aspnet webforms'.

Page 8 of 45

  • Error message on using LinqDataSource in webforms

    - by naveen
    Coding Platform: ASP.NET 4.0. I am binding a GridView to a LinqDataSource with AutoDelete functionality enabled. The GridView is bound to the Products table. I have a Products table and a Category table with an association on CategoryID. If I try to delete a Category that is referenced by the Products table, the delete fails. That is totally acceptable, but I want the end user to be notified with some error message. Where can I catch this error?
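
    One place such a delete failure can be surfaced (a minimal sketch, not from the original question; the data source and label names are assumptions) is the LinqDataSource's Deleted event, whose event args expose the exception thrown by the delete:

        protected void ProductsDataSource_Deleted(object sender, LinqDataSourceStatusEventArgs e)
        {
            // e.Exception is non-null when the DELETE failed, for example because of the
            // foreign key from Products to Category.
            if (e.Exception != null)
            {
                e.ExceptionHandled = true;   // stop the exception from propagating
                lblError.Text = "This category is still used by one or more products and cannot be deleted.";
            }
        }

    The GridView's RowDeleted event exposes the same Exception/ExceptionHandled pair if handling it at the grid level is preferable.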

    Read the article

  • What are some good resources for WebForms MVP?

    - by Nissan Fan
    I've seen a little buzz on ASP.NET Web Forms MVP, but where can I get resources? http://webformsmvp.com is pretty much stubbed out for now. This appears to be a compelling refresh of the Web Forms paradigm and brings into the fold things that make ASP.NET MVC great. I hear it's going to be put out there at MIX10 this week, but does anyone have any useful sites/references?

    Read the article

  • Uniquely identifying mobile devices over a network for webforms

    - by Eric
    I'm designing a system for mobile devices that can each be assigned to only one job at a time. So I need to know which mobile device is being used by reading its own unique static IP address or its device ID. I don't want to assign an ID myself for every machine that comes in, which is why a static IP would work great. However, in trying to retrieve the client IP address I'm getting the wireless router's IP or some other IP which is not the mobile device's. I want to store that IP in a table and control which jobs are assigned to it. How can I accomplish this? I've tried the following, but I'm getting the wireless IP:

        var hostEntry = Dns.GetHostEntry(Dns.GetHostName());
        var ip = (from addr in hostEntry.AddressList
                  where addr.AddressFamily.ToString() == "InterNetwork"
                  select addr.ToString()).FirstOrDefault();

    I'd rather not set a cookie if there is a better alternative. TIA!
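
    For what it's worth, Dns.GetHostEntry(Dns.GetHostName()) resolves the web server's own addresses rather than the caller's. A minimal sketch of reading the address the request actually arrived from (still subject to NAT and proxies, so it may show a gateway rather than the device itself):

        // The remote endpoint as seen by IIS; behind NAT or a proxy this is the
        // gateway's address, not necessarily the device's.
        string clientIp = Request.UserHostAddress;

        // Some proxies forward the original address in a header; treat it as untrusted input.
        string forwardedFor = Request.Headers["X-Forwarded-For"];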

    Read the article

  • Circular file references not allowed

    - by Program.X
    Hi, I am having a problem building my solution in VS2008. Normally it compiles fine in the environment. Sometimes it fails with:

        /xxx_WEB/secure/CMSManagedTargetPage.aspx(1): error ASPPARSE: Circular file references are not allowed.

    I rebuild and it works fine. Now, however, I am in the middle of setting up a CruiseControl.NET system and am testing my checked-out code with MSBuild before I integrate the build into CC. Now, every time I MSBuild, I get:

        "Q:\cc\xxx\checked out from svn\xxx.sln" (default target) (1) ->
        (xxx_WEB target) ->
        /xxx_WEB/secure/CMSManagedTargetPage.aspx(1): error ASPPARSE: Circular file references are not allowed.

    The problem is, I can't see where this reference is. I have searched for the reference across the entire solution and can find no references to the page itself (CMSManagedTargetPage) anywhere other than in the page or its code-behind, or within a string, e.g.:

        C:\dev2008\xxx\IWW.xxx.ASPNET\AspxHttpHandler.cs(82): inputFile = context.Server.MapPath("~/secure/CMSManagedTargetPage.aspx");
        C:\dev2008\xxx\IWW.xxx.ASPNET\AspxHttpHandler.cs(83): virtualPath = "~/secure/CMSManagedTargetPage.aspx";

    My assembly references are also fine (as far as I know). My Web Application is at the "top" of the dependencies, and nothing references it, so the faulting page cannot cause a circular reference. Of course, the page itself may reference something such as a UserControl within the same assembly/web site, but as mentioned earlier, a search on CMSManagedTargetPage yielded no results, so this is not happening. Changing the batch attribute in web.config had no effect on MSBuild. I find it very odd that it "sometimes" fails in VS and always fails in MSBuild. Am I missing some subtlety?
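
    For reference, a sketch of the web.config fragment the batch attribute lives in (my own illustration, not taken from the post): batch="false" disables batch compilation and is a commonly suggested workaround for spurious circular-reference errors, though as noted above it did not change the MSBuild behaviour here.

        <system.web>
          <compilation batch="false">
            <!-- assemblies, buildProviders, etc. as already configured -->
          </compilation>
        </system.web>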

    Read the article

  • Urlrewriting.net pages are not causing postbacks

    - by Nick
    I'm using webforms with UrlRewriting.Net to rewrite pages, e.g. http://www.example.com/stuff.aspx?c=30 becomes http://www.example.com/stuff/30-this-stuff.aspx. It works insofar as the correct content is loading; however, none of the postbacks are working (mostly buttons on the page). If I set a breakpoint on Page_Load, I see that IsPostBack is always false. Any ideas on how to fix this? Right now I'm just on Visual Studio 2008. EDIT: I have since switched to UrlRewriter.Net, which worked after a few tweaks (see Scott Gu's article). Besides here, I have posted my original problem to the developer's forum: if I ever get an answer, I'll post it here (unless someone else posts it here first).
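
    The usual culprit with rewritten URLs in WebForms is that the form tag posts back to the internal (rewritten) .aspx path instead of the public URL. A minimal sketch of one common workaround, assuming ASP.NET 3.5 SP1 or later where HtmlForm exposes a settable Action property (earlier versions need a form control adapter, as described in Scott Gu's article):

        protected void Page_Load(object sender, EventArgs e)
        {
            // Make the form post back to the URL the browser actually requested
            // (the friendly /stuff/30-this-stuff.aspx form) instead of the rewritten path.
            Page.Form.Action = Request.RawUrl;
        }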

    Read the article

  • How To Create An ASP.NET Designer Host

    - by Zuhaib
    I have been asked to build an application where I can drag and drop a few WebControls onto a WebPage designer surface. So far I have read some articles on the .NET Framework design-time architecture, like the MSDN Article: Hosting WinForms Designers, the Developer Function article, etc. But I can't find a way to host the WebForms designer. The only open source implementation that I could find was the MonoDevelop ASP.NET Visual Designer. But in Mono they have used GTK# & Gecko etc. to host the designer. I can't find a way to do the same using WinForms. Please suggest what I should do. Give me some pointers. Should I go ahead and implement it using GTK# and Gecko? PS: My requirement is not very huge. I just need a way to drag and drop simple web controls and save the page.

    Read the article

  • Google index vs asp.net url routing

    - by vani
    I've made the change from a querystring ASP.NET webforms application to a URL-routing one. Now I'm trying to let Google know that there's a new set of URLs to be aware of. Google has indexed 133 out of 179 new URLs sent in sitemap.xml, but the site:mistral.hr command returns the old querystring version of the links. This could partly be because I left the functionality of the old querystring version of the site in place (for backwards compatibility), and haven't done any 301 redirects (nor do I know how to achieve them on my shared hosting account). The old querystring and the new URL routing both point to the same Default.aspx page.
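
    A minimal sketch of issuing a 301 from an old querystring URL to its routed equivalent without any IIS-level configuration (the parameter name "id" and the BuildRoutedUrl helper are hypothetical and stand in for whatever builds the new links):

        protected void Page_Load(object sender, EventArgs e)
        {
            // Only redirect requests that arrive in the old querystring form.
            if (!string.IsNullOrEmpty(Request.QueryString["id"]))
            {
                string newUrl = BuildRoutedUrl(Request.QueryString["id"]);  // hypothetical helper
                Response.Status = "301 Moved Permanently";
                Response.AddHeader("Location", newUrl);
                Response.End();
            }
        }

    On ASP.NET 4.0 the same thing can be done in one call with Response.RedirectPermanent(newUrl).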

    Read the article

  • asp.net dynamic sitemap with querystring? Session variable seems like overkill.

    - by Mike
    Hi, I'm using ASP.NET WebForms with the standard sitemap provider and a hierarchy of Home > User > Account > Entry. The home page should show a user selection screen. Clicking on a user should list the user's accounts with options to edit, delete and add accounts. Selecting an account should list all of that account's entries with options to edit, delete and add entries. How do you normally pass this information between pages? I could use the query string, but then the sitemap doesn't work: it only has the exact page without the query string and therefore loses the information.

        /User/Account/List.aspx?User=123
        /User/Account/Entry/List.aspx?User=123&Account=322

    I could use a session variable, but this seems like overkill. Thoughts and suggestions very much appreciated. Thanks!
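
    A sketch of one common approach (an assumption on my part, following the documented SiteMapResolve pattern rather than anything in the question): clone the current node chain and re-attach the current query string, so the sitemap keeps User and Account without resorting to session state. Wire-up shown in Global.asax:

        void Application_Start(object sender, EventArgs e)
        {
            SiteMap.SiteMapResolve += OnSiteMapResolve;
        }

        static SiteMapNode OnSiteMapResolve(object sender, SiteMapResolveEventArgs e)
        {
            if (SiteMap.CurrentNode == null)
                return null;

            // Clone the current node and its ancestors so the provider's shared nodes stay untouched.
            SiteMapNode current = SiteMap.CurrentNode.Clone(true);
            string query = e.Context.Request.Url.Query;   // e.g. "?User=123&Account=322"

            for (SiteMapNode node = current; node != null; node = node.ParentNode)
            {
                node.Url = node.Url + query;
            }
            return current;
        }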

    Read the article

  • Invalid Viewstate

    - by murak
    I always get this error on my site, guys. Anybody got a solution?

    Stack trace:

        at System.Web.UI.Page.DecryptStringWithIV(String s, IVType ivType)
        at System.Web.UI.Page.DecryptString(String s)
        at System.Web.Handlers.ScriptResourceHandler.DecryptParameter(NameValueCollection queryString)
        at System.Web.Handlers.ScriptResourceHandler.ProcessRequestInternal(HttpResponse response, NameValueCollection queryString, VirtualFileReader fileReader)
        at System.Web.Handlers.ScriptResourceHandler.ProcessRequest(HttpContext context)
        at System.Web.Handlers.ScriptResourceHandler.System.Web.IHttpHandler.ProcessRequest(HttpContext context)
        at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
        at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

    Query string:

        d=J_c3w3Q59U-PnoRlWBPOJMVgHe_9Ile9wANEXiRFLzG8mequestManager._initialize('ctl00%24ScriptManager1'

    I noticed that there are strings appended to the last part of the ScriptResource.axd request which are not part of the querystring (equestManager._initialize('ctl00%24ScriptManager1'). I don't know how this string ends up here. I am using MS Ajax, webforms and IIS7 on a shared hosting plan.

    Read the article

  • Prevent multiple logons for a single user in ASP .Net

    - by ilivewithian
    I am looking at how best to prevent a single user account logging on multiple times in a webforms application. I know that MembershipUser.IsOnline exists, but I've read a few forum and blog entries suggesting that this can be unreliable, particularly in scenarios where a user closes a browser (without logging out) and attempts to log on with a different machine or browser. I looked at implementing a last-past-the-post type system: when a user logs on, older sessions are simply kicked off. It seems that FormsAuthentication.SignOut() only works for the current user. Am I missing a trick? Is there a better way to prevent the same username logging on from multiple different locations?
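
    A minimal sketch of the last-past-the-post idea without relying on MembershipUser.IsOnline (my own assumption, for a single web server): record a per-login token in Application state keyed by username, stamp it into the user's session at login, and on each request kick out any session whose token is no longer the latest.

        using System;
        using System.Web;
        using System.Web.Security;

        public static class SingleLogin
        {
            // Call after a successful login: the newest logon for a username wins.
            public static void RegisterLogin(string userName)
            {
                string token = Guid.NewGuid().ToString();
                HttpContext.Current.Application["login:" + userName] = token;
                HttpContext.Current.Session["LoginToken"] = token;
            }

            // Call early in each request (e.g. from a base page's OnInit).
            public static void EnforceSingleLogin()
            {
                HttpContext ctx = HttpContext.Current;
                if (!ctx.Request.IsAuthenticated) return;

                string userName = ctx.User.Identity.Name;
                string latest = (string)ctx.Application["login:" + userName];
                string mine = (string)ctx.Session["LoginToken"];

                if (mine == null || latest != mine)
                {
                    // This session was superseded by a newer logon elsewhere, so sign it out.
                    FormsAuthentication.SignOut();
                    ctx.Session.Abandon();
                    FormsAuthentication.RedirectToLoginPage();
                }
            }
        }

    On a web farm the token store would need to move from Application state to a shared store such as a database or distributed cache.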

    Read the article

  • Why won't binding to a child object property with rdlc Report work in vs2010?

    - by andrej351
    A while ago someone asked how to bind to a child object's property in an rdlc report. Question here. The solution was to use an expression like this: =Fields!ChildObject.Value.SomeProperty. I have recently tried to upgrade to version 10 of the reporting libraries (Microsoft.ReportViewer.WebForms and Microsoft.ReportViewer.Common) and for some reason this does not work anymore. I have got the report rendering and displaying all data fine except anything which uses this technique. Instead of the property value I get the text: "#Error". Why doesn't this work anymore? Does anybody know how to do this in the new version?

    Read the article

  • Postback problem downloading zip file

    - by Chris Conway
    I've got a problem on a webforms application where a user selects some criteria from dropdowns on the page and hits a button on the page which calls this method: protected void btnSearch_Click(object sender, EventArgs e). They then click a button to download a zip file based on the criteria, which calls this method: protected void btnDownload_Click(object sender, EventArgs e). In IE, they are prompted with the bar at the top of the browser that tells them "To help protect your security, Internet Explorer blocked this site from downloading files to your computer". When they click on that bar to download the file, it fires the btnSearch_Click event again. Response.ContentType and Response.AddHeader have been set up correctly. The problem is that btnSearch appends criteria, so basically it is being appended twice and causing problems. Is there something I can do to prevent this? This is a VS2008 web app using C# 3.5, for what it's worth. Thanks!
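
    One way to sidestep the double execution entirely (a sketch of an alternative approach, not from the original post; BuildZipForCriteria is a hypothetical helper) is to serve the zip from a separate, stateless endpoint such as a generic handler reached by a GET link with the criteria in the query string, so that IE's information-bar re-request never re-runs the search postback:

        using System;
        using System.Web;

        // Download.ashx: reached via a plain GET link such as Download.ashx?criteria=...
        public class Download : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                string criteria = context.Request.QueryString["criteria"];
                byte[] zipBytes = BuildZipForCriteria(criteria);

                context.Response.Clear();
                context.Response.ContentType = "application/zip";
                context.Response.AddHeader("Content-Disposition", "attachment; filename=\"results.zip\"");
                context.Response.BinaryWrite(zipBytes);
                context.Response.End();
            }

            public bool IsReusable
            {
                get { return false; }
            }

            // Hypothetical: build the zip for the given criteria (stubbed here).
            private static byte[] BuildZipForCriteria(string criteria)
            {
                throw new NotImplementedException();
            }
        }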

    Read the article

  • Using 'DataKeyNames' on an ASP.NET GridView control, causes exception on DataBind() call

    - by azam
    Hi, I have an ASP.NET 3.5 WebForms application, using a GridView control and the DataKeyNames attribute:

        <asp:GridView ID="data" runat="server" DataKeyNames="ID">
            <Columns>
                <asp:BoundField DataField="ID" /> <!-- this binds -->

    If I take out the DataKeyNames="ID" it binds the data. I would really like to try and use DataKeyNames so I don't need to have the ID column in the columns list. Error:

        DataBinding: 'Nmspace.MyClass' does not contain a property with the name '"ID"'.

    Please help.

    Read the article

  • Where to include business logic in a domain driven architecture

    - by Mike C.
    I'm trying to learn effective DDD practices as I go, but had a fundamental question I wanted to get some clarity on. I am using ASP.NET WebForms and I am creating a situation where a user places an order. Upon order submission, the code-behind retrieves the user, builds the order from the inputs on the form, calls the User.PlaceOrder() method to add the order object to the user's order collection, and calls the repository to save the record to the database. That is fairly simple and straightforward. Now I need to add logic to send an order confirmation email, and I'm not really sure of the proper place to put this code or where to call it. In the olden days I would simply put that code in the code-behind and call it at the same time I was building the order, but I want to get a step closer to solid, proper architecture, so I wanted to get some information. Thanks for your help!
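
    One common shape for this (a sketch under my own assumptions, not the poster's code: the OrderService and IOrderNotifier names are invented, and User, Order and IUserRepository stand in for the types described above) is to move the orchestration out of the code-behind into an application service, keep PlaceOrder as pure domain behaviour, and treat the confirmation email as infrastructure behind an interface:

        public interface IOrderNotifier
        {
            // An SMTP-based implementation would live in the infrastructure layer.
            void SendOrderConfirmation(User user, Order order);
        }

        public class OrderService
        {
            private readonly IUserRepository _users;
            private readonly IOrderNotifier _notifier;

            public OrderService(IUserRepository users, IOrderNotifier notifier)
            {
                _users = users;
                _notifier = notifier;
            }

            // The code-behind calls only this; the page stays ignorant of email concerns.
            public void PlaceOrder(int userId, Order order)
            {
                User user = _users.GetById(userId);
                user.PlaceOrder(order);                        // domain logic
                _users.Save(user);                             // persistence
                _notifier.SendOrderConfirmation(user, order);  // infrastructure side effect
            }
        }

    The code-behind then only builds the Order from the form inputs and calls OrderService.PlaceOrder, so neither the page nor the domain entity knows anything about email.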

    Read the article

  • Is it possible to have a .NET route that maps to the same place as a directory?

    - by Austin
    I'm building a CMS using WebForms on .NET 4.0 and have the following route that allows URLs like www.mysite.com/about to be mapped to the Page.aspx page, which looks up the dynamic content. routes.MapPageRoute("page", "{name}", "~/Page.aspx"); The problem is that I have a couple of folders in my project that are interfering with possible URLs. For example, I have a folder called "blog" where I store pages related to handling blog functionality, but if someone creates a page for their site called "blog" then navigating to www.mysite.com/blog gets the following error: 403 - Forbidden: Access is denied. You do not have permission to view this directory or page using the credentials that you supplied. Other similar URLs route correctly, but I think because .NET is identifying /blog as a physical location on the server it is denying directory access. Is there a way to tell IIS / .NET to only look for physical files instead of files and folders?
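
    A minimal sketch of one setting worth trying (an assumption on my part, not something from the original post): RouteCollection.RouteExistingFiles makes routing evaluate routes even when the URL matches a physical file or folder, so a CMS page named "blog" can fall through to Page.aspx, while resources that must still be served physically can be kept out of routing with an ignore rule.

        public static void RegisterRoutes(RouteCollection routes)
        {
            // Evaluate routes even when the URL matches a physical file or directory,
            // so a CMS page named "blog" is no longer shadowed by the /blog folder.
            routes.RouteExistingFiles = true;

            // Keep resources that really must be served physically out of routing.
            routes.Ignore("{resource}.axd/{*pathInfo}");

            routes.MapPageRoute("page", "{name}", "~/Page.aspx");
        }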

    Read the article

  • asp.net web form custom route handler

    - by jparram
    I am using a custom route handler for a webforms application. I am using routes to determine localization, i.e. if the URL has es or fr in the route it will load either Spanish or French resources. For example, www.someroute/es/checkstuff/checkstuff.aspx will load www.someroute/checkstuff/checkstuff.aspx with the Spanish resources. I am configuring the custom routes in global.asax via:

        protected void Application_Start(object sender, EventArgs e)
        {
            RegisterRoutes(RouteTable.Routes);
        }

        public static void RegisterRoutes(RouteCollection routes)
        {
            foreach (var value in _customRoutes)
            {
                routes.Add(value.RouteName, new Route(value.Route, new CustomRouteHandler(value.ResolvedRoute)));
            }
        }

    where _customRoutes is a list of routes. Is there a way to do this with some kind of pattern matching so I can avoid adding a specific route for each page in the application? While I know I could use a T4 template to generate the routes, I am looking for a dynamic way to create the list.
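
    A sketch of one way to avoid per-page routes (my own assumption, not from the post; the CultureAwareRouteHandler is hypothetical and plays the role of the question's CustomRouteHandler, resolving the target path from the route data instead of a fixed constructor argument): register a single catch-all route with a constrained culture segment.

        public static void RegisterRoutes(RouteCollection routes)
        {
            // One route covers every page: "{culture}" is constrained to the supported
            // languages and "{*path}" captures the rest of the URL, e.g.
            // /es/checkstuff/checkstuff.aspx -> culture = "es", path = "checkstuff/checkstuff.aspx".
            var localizedRoute = new Route("{culture}/{*path}", new CultureAwareRouteHandler())
            {
                Constraints = new RouteValueDictionary { { "culture", "es|fr" } }
            };
            routes.Add("localized", localizedRoute);
        }

        // Hypothetical handler: reads culture and path from the route data and maps
        // the request to the corresponding physical page.
        public class CultureAwareRouteHandler : IRouteHandler
        {
            public IHttpHandler GetHttpHandler(RequestContext requestContext)
            {
                string culture = (string)requestContext.RouteData.Values["culture"];
                string path = (string)requestContext.RouteData.Values["path"];
                // ...set the thread culture / load resources based on "culture" here...
                return (IHttpHandler)System.Web.Compilation.BuildManager
                    .CreateInstanceFromVirtualPath("~/" + path, typeof(System.Web.UI.Page));
            }
        }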

    Read the article

  • Microsoft Declares the Future of ASP.NET is Web API

    - by sbwalker
    Sitting on a plane on my way home from Tech Ed 2012 in Orlando, I thought it would be a good time to jot down some key takeaways from this year’s conference. Some of these items I have known since the Microsoft MVP Summit which occurred in Redmond in late February ( but due to NDA restrictions I could not share them with the developer community at large ) and some of them are a result of insightful conversations with a wide variety of industry insiders and Microsoft employees at the conference.

    First, let’s travel back in time 4 years to the Microsoft MVP Summit in 2008. Microsoft was facing some heat from market newcomer Ruby on Rails and responded with a new web development framework of its own, ASP.NET MVC. At the Summit they estimated that MVC would only be applicable for ~10% of all new web development projects. Based on that prediction I questioned why they were investing such considerable resources for such a relative edge case, but my guess is that they felt it was an important edge case at the time as some of the more vocal .NET evangelists as well as some very high profile start-ups ( ie. Twitter ) had publicly announced their intent to use Rails.

    Microsoft made a lot of noise about MVC. In fact, they focused so much of their messaging and marketing hype around MVC that it appeared that WebForms was essentially dead. Yes, it may have been true that Microsoft continued to invest in WebForms, but from an outside perspective it really appeared that MVC was the only framework getting any real attention. As a result, MVC started to gain market share. An inside source at Microsoft told me that MVC usage has grown at a rate of about 5% per year and now sits at ~30%. Essentially by focusing so much marketing effort on MVC, Microsoft actually created a larger market demand for it.

    This is because in the Microsoft ecosystem there is somewhat of a bandwagon mentality amongst developers. If Microsoft spends a lot of time talking about a specific technology, developers get the perception that it must be really important. So rather than choosing the right tool for the job, they often choose the tool with the most marketing hype and then try to sell it to the customer.

    In 2010, I blogged about the fact that MVC did not make any business sense for the DotNetNuke platform. This was because our ecosystem relied on third party extensions which were dependent on the WebForms model. If we migrated the core to MVC it would mean that all of the third party extensions would no longer be compatible, which would be an irresponsible business decision for us to make at the expense of our users and customers. However, this did not stop the debate from continuing to occur in our ecosystem. Clearly some developers had drunk Microsoft’s Kool-Aid about MVC and were of the mindset, to paraphrase an old Scottish saying, “If its not MVC, it’s crap”. Now, this is a rather ignorant position to take as most of the benefits of MVC can be achieved in WebForms with solid architecture and responsible coding practices. Clean separation of concerns, unit testing, and direct control over page output are all possible in the WebForms model – it just requires diligence and discipline. So over the past few years some horror stories have begun to bubble to the surface of software development projects focused on ground-up rewrites of web applications for the sole purpose of migrating from WebForms to MVC.
    These large scale rewrites were typically initiated by engineering teams with only a single argument driving the business decision, that Microsoft was promoting MVC as “the future”. These ill-fated rewrites offered no benefit to end users or customers and in fact resulted in less stable, less scalable and more complicated systems – basically taking one step forward and two full steps back. A case in point is the announcement earlier this week that a popular open source .NET CMS provider has decided to pull the plug on their new MVC product which has been under active development for more than 18 months and revert back to WebForms.

    The availability of multiple server-side development models has deeply fragmented the Microsoft developer community. Some folks like to compare it to the age-old VB vs. C# language debate. However, the VB vs. C# language debate was ultimately more of a religious war because at least the two dominant programming languages were compatible with one another and could be used interchangeably. The issue with WebForms vs. MVC is much more challenging. This is because the messaging from Microsoft has positioned the two solutions as being incompatible with one another and as a result web developers feel like they are forced to choose one path or another. Yes, it is true that it has always been technically possible to use WebForms and MVC in the same project, but the tooling support has always made this feel “dirty”. The fragmentation has also made it difficult to attract newcomers as the perceived barrier to entry for learning ASP.NET has become higher. As a result many new software developers entering the market are gravitating to environments where the development model seems more simple and intuitive ( ie. PHP or Ruby ).

    At the same time that the Web Platform team was busy promoting ASP.NET MVC, the Microsoft Office team has been promoting Sharepoint as a platform for building internal enterprise web applications. Sharepoint has great penetration in the enterprise and over time has been enhanced with improved extensibility capabilities for software developers. But, like many other mature enterprise ASP.NET web applications, it is built on the WebForms development model. Similar to DotNetNuke, Sharepoint leverages a rich third party ecosystem for both generic web controls and more specialized WebParts – both of which rely on WebForms. So basically this resulted in a situation where the Web Platform group had headed off in one direction and the Office team had gone in another direction, and the end customer was stuck in the middle trying to figure out what to do with their existing investments in Microsoft technology. It really emphasized the perception that the left hand was not speaking to the right hand, as strategically speaking there did not seem to be any high level plan from Microsoft to ensure consistency and continuity across the different product lines.

    With the introduction of ASP.NET MVC, it also made some of the third party control vendors scratch their heads, and wonder what the heck Microsoft was thinking. The original value proposition of ASP.NET over Classic ASP was the ability for web developers to emulate the highly productive desktop development model by using abstract components for creating rich, interactive web interfaces. Web control vendors like Telerik, Infragistics, DevExpress, and ComponentArt had all built sizable businesses offering powerful user interface components to WebForms developers.
    And even after MVC was introduced these vendors continued to improve their products, offering greater productivity and a superior user experience via AJAX to what was possible in MVC. And since many developers were comfortable and satisfied with these third party solutions, the demand remained strong and the third party web control market continued to prosper despite the availability of MVC.

    While all of this was going on in the Microsoft ecosystem, there has also been a fundamental shift in the general software development industry. Driven by the explosion of Internet-enabled devices, the focus has now centered on service-oriented architecture (SOA). Service-oriented architecture is all about defining a public API for your product that any client can consume; whether it’s a native application running on a smart phone or tablet, a web browser taking advantage of HTML5 and Javascript, or a rich desktop application running on a PC. REST-based services which utilize the less verbose characteristics of JSON as a transport mechanism, have become the preferred approach over older, more bloated SOAP-based techniques. SOA also has the benefit of producing a cross-platform API, as every major technology stack is able to interact with standard REST-based web services. And for web applications, more and more developers are turning to robust Javascript libraries like JQuery and Knockout for browser-based client-side development techniques for calling web services and rendering content to end users. In fact, traditional server-side page rendering has largely fallen out of favor, resulting in decreased demand for server-side frameworks like Ruby on Rails, WebForms, and (gasp) MVC.

    In response to these new industry trends, Microsoft did what it always does – it immediately poured some resources into developing a solution which will ensure they remain relevant and competitive in the web space. This work culminated in a new framework which was branded as Web API. It is convention-based and designed to embrace native HTTP standards without copious layers of abstraction. This framework is designed to be the ultimate replacement for both the REST aspects of WCF and ASP.NET MVC Web Services. And since it was developed out of band with a dependency only on ASP.NET 4.0, it means that it can be used immediately in a variety of production scenarios.

    So at Tech Ed 2012 it was made abundantly clear in numerous sessions that Microsoft views Web API as the “Future of ASP.NET”. In fact, one Microsoft PM even went as far as to say that if we look 3-4 years into the future, that all ASP.NET web applications will be developed using the Web API approach. This is a fairly bold prediction and clearly telegraphs where Microsoft plans to allocate its resources going forward. Currently Web API is being delivered as part of the MVC4 package, but this is only temporary for the sake of convenience. It also sounds like there are still internal discussions going on in terms of how to brand the various aspects of ASP.NET going forward – perhaps the moniker of “ASP.NET Web Stack” coined a couple years ago by Scott Hanselman and utilized as part of the open source release of ASP.NET bits on Codeplex a few months back will eventually stick.

    Web API is being positioned as the unification of ASP.NET – the glue that is able to pull this fragmented mess back together again. The “One ASP.NET” strategy will promote the use of all frameworks - WebForms, MVC, and Web API, even within the same web project.
    Basically the message is to utilize the appropriate aspects of each framework to solve your business problems. Instead of navigating developers to a fork in the road, the plan is to educate them that “hybrid” applications are a great strategy for delivering solutions to customers. In addition, the service-oriented approach coupled with client-side development promoted by Web API can effectively be used in both WebForms and MVC applications. So this means it is also relevant to application platforms like DotNetNuke and Sharepoint, which means that it starts to create a unified development strategy across all ASP.NET product lines once again.

    And so what about MVC? There have actually been rumors floated that MVC has reached a stage of maturity where, similar to WebForms, it will be treated more as a maintenance product line going forward ( MVC4 may in fact be the last significant iteration of this framework ). This may sound alarming to some folks who have recently adopted MVC but it really shouldn’t, as both WebForms and MVC will continue to play a vital role in delivering solutions to customers. They will just not be the primary area where Microsoft is spending the majority of its R&D resources. That distinction will obviously go to Web API.

    And when the question comes up of why not enhance MVC to make it work with Web API, you must take a step back and look at this from the higher level to see that it really makes no sense. MVC is a server-side page compositing framework; whereas, Web API promotes client-side page compositing with a heavy focus on web services. Making MVC work well with Web API would require a complete rewrite of MVC and at the end of the day, there would be no upgrade path for existing MVC applications. So it really does not make much business sense.

    So what does this have to do with DotNetNuke? Well, around 8-12 months ago we recognized the software industry trends towards web services and client-side development. We decided to utilize a “hybrid” model which would provide compatibility for existing modules while at the same time provide a bridge for developers who wanted to utilize more modern web techniques. Customers who like the productivity and familiarity of WebForms can continue to build custom modules using the traditional approach. However, in DotNetNuke 6.2 we also introduced a new Service Framework which is actually built on top of MVC2 ( we chose to leverage MVC because it had the most intuitive, light-weight REST implementation in the .NET stack ). The Services Framework allowed us to build some rich interactive features in DotNetNuke 6.2, including the Messaging and Notification Center and Activity Feed.

    But based on where we know Microsoft is heading, it makes sense for the next major version of DotNetNuke ( which is expected to be released in Q4 2012 ) to migrate from MVC2 to Web API. This will likely result in some breaking changes in the Services Framework but we feel it is the best approach for ensuring the platform remains highly modern and relevant. The fact that our development strategy is perfectly aligned with the “One ASP.NET” strategy from Microsoft means that our customers and developer community can be confident in their current and future investments in the DotNetNuke platform.

    Read the article

  • Making an AJAX WCF Web Service request during an Async Postback

    - by nekno
    I want to provide status updates during a long-running task on an ASP.NET WebForms page with AJAX. Is there a way to get the ScriptManager to execute and process a script for a web service request during an async postback? I have a script on the page that makes a web service request. It runs on page load and periodically using setInterval(). It's running correctly before the async postback is initiated, but it stops running during the async postback, and doesn't run again until after the async postback completes. I have an UpdatePanel with a button to trigger an async postback, which executes the long-running task. I also have an instance of an AJAX WCF Web service that is working correctly to fetch data and present it on the page but, like I said, it doesn't fetch and present the data until after the async postback completes. During the async postback, the long-running task sends updates from the page to the web service. The problem is that I can debug and step through the web service and see that the status updates are correctly set, but the updates aren't retrieved by the client script until the async postback completes. It seems the Script Manager is busy executing the async postback, so it doesn't run my other JavaScript via setInterval() until the postback completes. Is there a way to get the Script Manager, or otherwise, to run the script to fetch data from the WCF web service during the async postback? I've tried various methods of using the PageRequestManager to run the script on the client-side BeginRequest event for the async postback, but it runs the script, then stops processing the code that should be running via setInterval() while the page request executes.

    Read the article

  • Making a concurrent AJAX WCF Web Service request during an Async Postback

    - by nekno
    I want to provide status updates during a long-running task on an ASP.NET WebForms page with AJAX. Is there a way to get the ScriptManager to execute and process a script for a web service request concurrently with an async postback? I have a script on the page that makes a web service request. It runs on page load and periodically using setInterval(). It's running correctly before the async postback is initiated, but it stops running during the async postback, and doesn't run again until after the async postback completes. I have an UpdatePanel with a button to trigger an async postback, which executes the long-running task. I also have an instance of an AJAX WCF Web service that is working correctly to fetch data and present it on the page but, like I said, it doesn't fetch and present the data until after the async postback completes. During the async postback, the long-running task sends updates from the page to the web service. The problem is that I can debug and step through the web service and see that the status updates are correctly set, but the updates aren't retrieved by the client script until the async postback completes. It seems the Script Manager is busy executing the async postback, so it doesn't run my other JavaScript via setInterval() until the postback completes. Is there a way to get the Script Manager, or otherwise, to run the script to fetch data from the WCF web service during the async postback? I've tried various methods of using the PageRequestManager to run the script on the client-side BeginRequest event for the async postback, but it runs the script, then stops processing the code that should be running via setInterval() while the page request executes.

    Read the article

  • Intermittent bug - IE6 showing file as text in browser, rather than as file download

    - by Richard Ev
    In an ASP.NET WebForms 2.0 site we are encountering an intermittent bug in IE6 whereby a file download attempt results in the contents of the file being shown directly in the browser as text, rather than the file save dialog being displayed. Our application allows the user to download both PDF and CSV files. The code we're using is:

        HttpResponse response = HttpContext.Current.Response;
        response.Clear();
        response.AddHeader("Content-Disposition", "attachment;filename=\"theFilename.pdf\"");
        response.ContentType = "application/pdf";
        response.BinaryWrite(MethodThatReturnsFileContents());
        response.End();

    This is called from the code-behind click event handler of a button server control. Where are we going wrong with this approach?

    Edit: Following James' answer to this posting, the code I'm using now looks like this:

        HttpResponse response = HttpContext.Current.Response;
        response.ClearHeaders();
        // Setting cache to NoCache was recommended, but doing so results in a security
        // warning in IE6
        //response.Cache.SetCacheability(HttpCacheability.NoCache);
        response.AppendHeader("Content-Disposition", "attachment; filename=\"theFilename.pdf\"");
        response.ContentType = "application/pdf";
        response.BinaryWrite(MethodThatReturnsFileContents());
        response.Flush();
        response.End();

    However, I don't believe that any of the changes made will fix the issue.

    Read the article

  • Ninject.Web.PageBase still resulting in null reference to injected dependency

    - by Ted
    I have an ASP.NET 3.5 WebForms application using Ninject 2.0. However, attempting to use the Ninject.Web extension to provide injection into System.Web.UI.Page, I'm getting a null reference to my injected dependency, even though if I switch to using a service locator to provide the reference (using Ninject), there's no issue. My configuration (dumbed down for simplicity):

        public partial class Default : PageBase // which is Ninject.Web.PageBase
        {
            [Inject]
            public IClubRepository Repository { get; set; }

            protected void Page_Load(object sender, EventArgs e)
            {
                var something = Repository.GetById(1); // results in null reference exception.
            }
        }

        //global.asax.cs
        public class Global : Ninject.Web.NinjectHttpApplication
        {
            /// <summary>
            /// Creates a Ninject kernel that will be used to inject objects.
            /// </summary>
            /// <returns>
            /// The created kernel.
            /// </returns>
            protected override IKernel CreateKernel()
            {
                IKernel kernel = new StandardKernel(new MyModule());
                return kernel;
            }
            ...
        }

        public class MyModule : NinjectModule
        {
            public override void Load()
            {
                Bind<IClubRepository>().To<ClubRepository>();
                //...
            }
        }

    Getting the IClubRepository concrete instance via a service locator works fine (uses the same "MyModule"), i.e.:

        private readonly IClubRepository _repository = Core.Infrastructure.IoC.TypeResolver.Get<IClubRepository>();

    What am I missing?

    [Update] Finally got back to this, and it works in Classic pipeline mode, but not Integrated. Is the classic pipeline a requirement?

    [Update 2] Wiring up my OnePerRequestModule was the problem (which I had removed in the above example for clarity):

        protected override IKernel CreateKernel()
        {
            var module = new OnePerRequestModule();
            module.Init(this);

            IKernel kernel = new StandardKernel(new MyModule());
            return kernel;
        }

    ...needs to be:

        protected override IKernel CreateKernel()
        {
            IKernel kernel = new StandardKernel(new MyModule());

            var module = new OnePerRequestModule();
            module.Init(this);

            return kernel;
        }

    Thus explaining why I was getting a null reference exception under the integrated pipeline (to a Ninject-injected dependency, or just a page load for a page inheriting from Ninject.Web.PageBase - whatever came first).

    Read the article

  • Parallel.For maintain input list order on output list

    - by romeozor
    I'd like some input on keeping the order of a list during heavy-duty operations that I decided to try to do in a parallel manner to see if it boosts performance. (It did!) I came up with a solution, but since this was my first attempt at anything parallel, I'd need someone to slap my hands if I did something very stupid. There's a query that returns a list of card owners, sorted by name, then by date of birth. This needs to be rendered in a table on a web page (ASP.NET WebForms). The original coder decided he would construct the table cell-by-cell (TableCell), add them to rows (TableRow), then each row to the table. So no GridView; allegedly its performance is bad, but the performance was very poor regardless :). The database query returns in no time; most of the time is spent looping through the results and adding table cells etc. I made the following method to maintain the original order of the list:

        private TableRow[] ComposeRows(List<CardHolder> queryResult)
        {
            int queryElementsCount = queryResult.Count();

            // array with the query's size
            var rowArray = new TableRow[queryElementsCount];

            Parallel.For(0, queryElementsCount, i =>
            {
                var row = new TableRow();
                var cell = new TableCell();

                // various operations, including simple ones such as:
                cell.Text = queryResult[i].Name;
                row.Cells.Add(cell);

                // here I'm adding the current item to its original index
                // to maintain order in the output list
                rowArray[i] = row;
            });

            return rowArray;
        }

    So as you can see, because I'm returning a very different type of data (List<CardHolder> -> TableRow[]), I can't simply omit the ordering from the original query and do it after the operations. I also thought it would be a good idea to Dispose() the objects at the end of each loop, because the query can return a huge list and letting cell and row objects pile up in the heap could impact performance.(?) How badly did I do? Does anyone have a better solution in case mine is flawed?

    Read the article

  • jQuery DataTables server side processing and ASP.Net

    - by Chad
    I'm trying to use the server-side functionality of the jQuery DataTables plugin with ASP.NET. The ajax request is returning valid JSON, but nothing is showing up in the table. I originally had problems with the data I was sending in the ajax request. I was getting an "Invalid JSON primitive" error. I discovered that the data needs to be in a string instead of JSON serialized, as described in this post: http://encosia.com/2008/06/05/3-mistakes-to-avoid-when-using-jquery-with-aspnet-ajax/. I wasn't quite sure how to fix that, so I tried adding this in the ajax request:

        "data": "{'sEcho': '" + aoData.sEcho + "'}"

    If the above eventually works I'll add the other parameters later. Right now I'm just trying to get something to show up in my table. The returned JSON looks ok and validates, but the sEcho in the post is undefined, and I think that's why no data is being loaded into the table. So, what am I doing wrong? Am I even on the right track or am I being stupid? Has anyone run into this before, or have any suggestions? Here's my jQuery:

        $(document).ready(function() {
            $("#grid").dataTable({
                "bJQueryUI": true,
                "sPaginationType": "full_numbers",
                "bServerSide": true,
                "sAjaxSource": "GridTest.asmx/ServerSideTest",
                "fnServerData": function(sSource, aoData, fnCallback) {
                    $.ajax({
                        "type": "POST",
                        "dataType": 'json',
                        "contentType": "application/json; charset=utf-8",
                        "url": sSource,
                        "data": "{'sEcho': '" + aoData.sEcho + "'}",
                        "success": fnCallback
                    });
                }
            });
        });

    HTML:

        <table id="grid">
            <thead>
                <tr>
                    <th>Last Name</th>
                    <th>First Name</th>
                    <th>UserID</th>
                </tr>
            </thead>
            <tbody>
                <tr>
                    <td colspan="5" class="dataTables_empty">Loading data from server</td>
                </tr>
            </tbody>
        </table>

    Webmethod:

        <WebMethod()> _
        Public Function ServerSideTest() As Data
            Dim list As New List(Of String)
            list.Add("testing")
            list.Add("chad")
            list.Add("testing")

            Dim container As New List(Of List(Of String))
            container.Add(list)

            list = New List(Of String)
            list.Add("testing2")
            list.Add("chad")
            list.Add("testing")
            container.Add(list)

            HttpContext.Current.Response.ContentType = "application/json"
            Return New Data(HttpContext.Current.Request("sEcho"), 2, 2, container)
        End Function

        Public Class Data
            Private _iTotalRecords As Integer
            Private _iTotalDisplayRecords As Integer
            Private _sEcho As Integer
            Private _sColumns As String
            Private _aaData As List(Of List(Of String))

            Public Property sEcho() As Integer
                Get
                    Return _sEcho
                End Get
                Set(ByVal value As Integer)
                    _sEcho = value
                End Set
            End Property

            Public Property iTotalRecords() As Integer
                Get
                    Return _iTotalRecords
                End Get
                Set(ByVal value As Integer)
                    _iTotalRecords = value
                End Set
            End Property

            Public Property iTotalDisplayRecords() As Integer
                Get
                    Return _iTotalDisplayRecords
                End Get
                Set(ByVal value As Integer)
                    _iTotalDisplayRecords = value
                End Set
            End Property

            Public Property aaData() As List(Of List(Of String))
                Get
                    Return _aaData
                End Get
                Set(ByVal value As List(Of List(Of String)))
                    _aaData = value
                End Set
            End Property

            Public Sub New(ByVal sEcho As Integer, ByVal iTotalRecords As Integer, ByVal iTotalDisplayRecords As Integer, ByVal aaData As List(Of List(Of String)))
                If sEcho <> 0 Then Me.sEcho = sEcho
                Me.iTotalRecords = iTotalRecords
                Me.iTotalDisplayRecords = iTotalDisplayRecords
                Me.aaData = aaData
            End Sub
        End Class

    Returned JSON:

        {"__type":"Data","sEcho":0,"iTotalRecords":2,"iTotalDisplayRecords":2,"aaData":[["testing","chad","testing"],["testing2","chad","testing"]]}

    Read the article

  • Issue with IIS...SERVER APPLICATION UNAVAILABLE

    - by SVI
    My web application was running absolutely fine. Just 2 days back, I got an error saying:

        SERVER APPLICATION UNAVAILABLE

    I am pretty certain that nothing was changed on IIS, unless my automatic Windows updates screwed it up completely. My event viewer had zillions of the following errors in the Application category:

        aspnet_wp.exe could not be started. The error code for the failure is C0000005. This error can be caused when the worker process account has insufficient rights to read the .NET Framework files. Please ensure that the .NET Framework is correctly installed and that the ACLs on the installation directory allow access to the configured account.

    I reinstalled IIS. After installing, I ran aspnet_regiis -i for framework v2, and now it throws an error saying "The application could not be initialized properly." Any ideas what's going on?

    Read the article

  • How to serve a View as CSV in ASP.NET Web Forms

    - by ChessWhiz
    Hi, I have an MS SQL view that I want to make available as a CSV download in my ASP.NET Web Forms app. I am using Entity Framework for other views and tables in the project. What's the best way to enable this download? I could add a HyperLink whose click handler iterates over the view, writes its CSV form to the disk, and then serves that file. However, I'd prefer not to write to the disk if it can be avoided, and that involves iteration code that may be avoided with some other solution. Any ideas?
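
    A minimal sketch of streaming the CSV straight to the response without touching the disk (my own assumptions: a LinkButton click handler, an EF context named MyEntities, an entity set named MyView, and two illustrative columns):

        protected void lnkExport_Click(object sender, EventArgs e)
        {
            Response.Clear();
            Response.ContentType = "text/csv";
            Response.AddHeader("Content-Disposition", "attachment; filename=\"view-export.csv\"");

            var sb = new System.Text.StringBuilder();
            sb.AppendLine("Id,Name");   // header row; column names are placeholders

            using (var ctx = new MyEntities())   // hypothetical EF context
            {
                foreach (var row in ctx.MyView)
                {
                    // Naive escaping: quote the text field and double any embedded quotes.
                    sb.AppendFormat("{0},\"{1}\"", row.Id, row.Name.Replace("\"", "\"\""));
                    sb.AppendLine();
                }
            }

            Response.Write(sb.ToString());
            Response.End();
        }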

    Read the article
