Search Results

Search found 2132 results on 86 pages for 'serve chilled'.

Page 19 of 86

  • Asynchronous pages in the ASP.NET framework - where are the other threads and how is the request reattached?

    - by rkrauter
    Sorry for this dumb question on Asynchronous operations. This is how I understand it. IIS has a limited set of worker threads waiting for requests. If one request is a long running operation, it will block that thread. This leads to fewer threads to serve requests. Way to fix this - use asynchronous pages. When a request comes in, the main worker thread is freed and this other thread is created in some other place. The main thread is thus able to serve other requests. When the request completes on this other thread, another thread is picked from the main thread pool and the response is sent back to the client. 1) Where are these other threads located? 2) IF ASP.NET likes creating new threads, why not increase the number of threads in the main worker pool - they are all running on the same machine anyway? 3) If the main thread hands off a request to this other thread, why does the request not get disconnected? It magically hands off the request to another worker thread somewhere else and when the long running process completes, it picks a thread from the main worker pool and sends response to the client. I am amazed...but how does that work?
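    A minimal sketch of the asynchronous page pattern being asked about (the class name, the slow HTTP call and the Label control are illustrative, not from the post): the begin handler starts the I/O and returns at once, freeing the worker thread, and ASP.NET later resumes on a thread-pool thread in the end handler once the I/O completes.

    ```csharp
    // Requires Async="true" in the @Page directive.
    using System;
    using System.Net;
    using System.Web.UI;
    using System.Web.UI.WebControls;

    public partial class SlowReport : Page
    {
        protected Label Label1;            // placeholder control normally declared in the .aspx
        private HttpWebRequest _request;

        protected void Page_Load(object sender, EventArgs e)
        {
            RegisterAsyncTask(new PageAsyncTask(BeginWork, EndWork, TimeoutWork, null));
        }

        // Runs on the request thread: starts the I/O and returns immediately,
        // releasing the worker thread back to the pool.
        private IAsyncResult BeginWork(object sender, EventArgs e, AsyncCallback cb, object state)
        {
            _request = (HttpWebRequest)WebRequest.Create("http://example.com/slow-endpoint");
            return _request.BeginGetResponse(cb, state);
        }

        // Runs later, on whatever pool thread the completion lands on.
        private void EndWork(IAsyncResult ar)
        {
            using (var response = (HttpWebResponse)_request.EndGetResponse(ar))
            {
                Label1.Text = "Backend status: " + response.StatusCode;
            }
        }

        private void TimeoutWork(IAsyncResult ar)
        {
            Label1.Text = "The backend call timed out.";
        }
    }
    ```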

    Read the article

  • Reloading Sinatra app on every request on Windows

    - by Darth
    I've set up Rack::Reload according to this thread # config.ru require 'rubygems' require 'sinatra' set :environment, :development require 'app' run Sinatra::Application # app.rb class Sinatra::Reloader < Rack::Reloader def safe_load(file, mtime, stderr = $stderr) if file == Sinatra::Application.app_file ::Sinatra::Application.reset! stderr.puts "#{self.class}: reseting routes" end super end end configure(:development) { use Sinatra::Reloader } get '/' do 'foo' end Running with thin via thin start -R config.ru, but it only reloads newly added routes. When I change an already existing route, it still runs the old code. When I add a new route, it correctly reloads it, so it is accessible, but it doesn't reload anything else. For example, if I changed the routes to get '/' do 'bar' end get '/foo' do 'baz' end Then / would still serve foo, even though it has changed, but /foo would correctly reload and serve baz. Is this normal behavior, or am I missing something? I'd expect the whole source file to be reloaded. The only way around this I can think of right now is restarting the whole web server when the filesystem changes. I'm running on Windows Vista x64, so I can't use shotgun because of fork().
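    For readability, here is the reloader setup from the excerpt laid out the way it would sit in the two files (content unchanged apart from whitespace):

    ```ruby
    # config.ru
    require 'rubygems'
    require 'sinatra'
    set :environment, :development
    require 'app'
    run Sinatra::Application
    ```

    ```ruby
    # app.rb
    class Sinatra::Reloader < Rack::Reloader
      def safe_load(file, mtime, stderr = $stderr)
        if file == Sinatra::Application.app_file
          ::Sinatra::Application.reset!
          stderr.puts "#{self.class}: reseting routes"
        end
        super
      end
    end

    configure(:development) { use Sinatra::Reloader }

    get '/' do
      'foo'
    end
    ```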

    Read the article

  • File Uploading in Google App Engine Using Django

    - by Ayush
    I am using gae with django. I have an project named MusicSite with following url mapping- urls.py from django.conf.urls.defaults import * from MusicSite.views import MainHandler from MusicSite.views import UploadHandler from MusicSite.views import ServeHandler urlpatterns = patterns('',(r'^start/', MainHandler), (r'^upload/', UploadHandler), (r'^/serve/([^/]+)?', ServeHandler), ) There is an application MusicSite inside MusicFun with the following codes- views.py import os import urllib from google.appengine.ext import blobstore from google.appengine.ext import webapp from google.appengine.ext.webapp import blobstore_handlers from google.appengine.ext.webapp import template from google.appengine.ext.webapp.util import run_wsgi_app def MainHandler(request): response=HttpResponse() upload_url = blobstore.create_upload_url('http://localhost: 8000/upload/') response.write('') response.write('' % upload_url) response.write("""Upload File: """) return HttpResponse(response) def UploadHandler(request): upload_files=request.FILES['file'] blob_info = upload_files[0] response.redirect('http://localhost:8000/serve/%s' % blob_info.key()) class ServeHandler(blobstore_handlers.BlobstoreDownloadHandler): def get(self, resource): resource = str(urllib.unquote(resource)) blob_info = blobstore.BlobInfo.get(resource) self.send_blob(blob_info) now whenever a upload a file using /start and click Submit i am taken to a blank page with the following url- localhost:8000/_ah/upload/ahhnb29nbGUtYXBwLWVuZ2luZS1kamFuZ29yGwsSFV9fQmxvYlVwbG9hZFNlc3Npb25fXxgHDA These random alphabets keep varying but the result is same. A blank page after every upload. Somebody please help. The server responses are as below- INFO:root:"GET /start/ HTTP/1.1" 200 - INFO:root:"GET /favicon.ico HTTP/1.1" 404 - INFO:root:Internal redirection to http://localhost:8000/upload/ INFO:root:Upload handler returned 500 ERROR:root:Invalid upload handler response. Only 301, 302 and 303 statuses are permitted and it may not have a content body. INFO:root:"POST /_ah/upload/ ahhnb29nbGUtYXBwLWVuZ2luZS1kamFuZ29yGwsSFV9fQmxvYlVwbG9hZFNlc3Npb25fXxgCDA HTTP/1.1" 500 - INFO:root:"GET /favicon.ico HTTP/1.1" 404 -
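    A hedged sketch of the usual fix for that 500 ("Only 301, 302 and 303 statuses are permitted and it may not have a content body"): the upload handler must answer the blobstore's rewritten request with a bare redirect, so return an HttpResponseRedirect instead of writing output. extract_blob_key() below is a hypothetical helper standing in for however the blob key is pulled out of the upload; it is not a GAE or Django API.

    ```python
    # Sketch only - the field name 'file' and the /serve/ URL follow the excerpt.
    from django.http import HttpResponseRedirect

    def UploadHandler(request):
        upload_files = request.FILES.getlist('file')
        blob_key = extract_blob_key(upload_files[0])   # hypothetical helper, see note above
        return HttpResponseRedirect('/serve/%s' % blob_key)
    ```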

    Read the article

  • Using {% url ??? %} in django templates

    - by user563247
    I have looked a lot on google for answers of how to use the 'url' tag in templates only to find many responses saying 'You just insert it into your template and point it at the view you want the url for'. Well no joy for me :( I have tried every permutation possible and have resorted to posting here as a last resort. So here it is. My urls.py looks like this: from django.conf.urls.defaults import * from login.views import * from mainapp.views import * import settings # Uncomment the next two lines to enable the admin: from django.contrib import admin admin.autodiscover() urlpatterns = patterns('', # Example: # (r'^weclaim/', include('weclaim.foo.urls')), (r'^login/', login_view), (r'^logout/', logout_view), ('^$', main_view), # Uncomment the admin/doc line below and add 'django.contrib.admindocs' # to INSTALLED_APPS to enable admin documentation: # (r'^admin/doc/', include('django.contrib.admindocs.urls')), # Uncomment the next line to enable the admin: (r'^admin/', include(admin.site.urls)), #(r'^static/(?P<path>.*)$', 'django.views.static.serve',{'document_root': '/home/arthur/Software/django/weclaim/templates/static'}), (r'^static/(?P<path>.*)$', 'django.views.static.serve',{'document_root': settings.MEDIA_ROOT}), ) My 'views.py' in my 'login' directory looks like: from django.shortcuts import render_to_response, redirect from django.template import RequestContext from django.contrib import auth def login_view(request): if request.method == 'POST': uname = request.POST.get('username', '') psword = request.POST.get('password', '') user = auth.authenticate(username=uname, password=psword) # if the user logs in and is active if user is not None and user.is_active: auth.login(request, user) return render_to_response('main/main.html', {}, context_instance=RequestContext(request)) #return redirect(main_view) else: return render_to_response('loginpage.html', {'box_width': '402', 'login_failed': '1',}, context_instance=RequestContext(request)) else: return render_to_response('loginpage.html', {'box_width': '400',}, context_instance=RequestContext(request)) def logout_view(request): auth.logout(request) return render_to_response('loginpage.html', {'box_width': '402', 'logged_out': '1',}, context_instance=RequestContext(request)) and finally the main.html to which the login_view points looks like: <html> <body> test! <a href="{% url logout_view %}">logout</a> </body> </html> So why do I get 'NoReverseMatch' every time? *(on a slightly different note I had to use 'context_instance=RequestContext(request)' at the end of all my render-to-response's because otherwise it would not recognise {{ MEDIA_URL }} in my templates and I couldn't reference any css or js files. I'm not to sure why this is. Doesn't seem right to me)*
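    A hedged sketch (Django 1.x-era syntax, to match the excerpt) of the usual cure for NoReverseMatch: give each pattern a name with url(..., name=...) and reference that name from the template instead of the bare view function.

    ```python
    # urls.py - only the login/logout patterns are shown; the rest stays as in the question
    from django.conf.urls.defaults import patterns, url
    from login.views import login_view, logout_view

    urlpatterns = patterns('',
        url(r'^login/$', login_view, name='login'),
        url(r'^logout/$', logout_view, name='logout'),
    )
    ```

    The template can then refer to the route name, e.g. <a href="{% url logout %}">logout</a> (quoting the name became required only in later Django versions).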

    Read the article

  • Losing 'post' requests sent to Pylons paster server

    - by Philip McDermott
    I'm sending post requests to a Pylons server (served by paster serve), and if I send them with any frequency many don't arrive at the server. One at a time is ok, but if I fire off a few (or more) within seconds, only a small number get dealt with. If I send with no post data, or with get, it works fine, but putting just one character of data in the post fields causes massive losses. For example, if I send 200, 2 will come back; sending 100 more slowly, 10 will come back. I'm making the requests from inside a Qt application. This will work OK (no data): QString postFields = "" QNetworkRequest request(QUrl("http://server.com/endpoint")); QNetworkReply *reply = networkAccessManager->post(request, postFields.toAscii()); And this will result in only a fraction of the requests being dealt with: QString postFields = "" QNetworkRequest request(QUrl("http://server.com/endpoint")); QNetworkReply *reply = networkAccessManager->post(request, postFields.toAscii()); I've played around with turning on use_threadpool, and other options (threadpool_workers, threadpool_max_requests = 300), of which some combinations can alter the results slightly (best case 10 responses in 200). If I send similar requests to other (non-paster) servers, the replies come back ok, so I'm almost certain it's a paster serve config issue. Any help or advice greatly appreciated. Thanks Philip
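    One client-side thing worth ruling out (a hedged aside, not a confirmed fix for the paster behaviour): send the POST with an explicit Content-Type and a non-empty urlencoded body, since some servers handle header-less POSTs badly. Qt 4-era API as in the excerpt; the URL and payload are illustrative.

    ```cpp
    #include <QNetworkAccessManager>
    #include <QNetworkReply>
    #include <QNetworkRequest>
    #include <QUrl>

    QNetworkReply *sendPost(QNetworkAccessManager *manager)
    {
        QNetworkRequest request(QUrl("http://server.com/endpoint"));
        request.setHeader(QNetworkRequest::ContentTypeHeader,
                          "application/x-www-form-urlencoded");
        QByteArray postFields("key=value");           // illustrative payload
        return manager->post(request, postFields);    // caller owns and later deletes the reply
    }
    ```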

    Read the article

  • Get remote image using cURL then resample.

    - by Chris
    I want to be able to retrieve a remote image from a webserver, resample it, and then serve it up to the browser AND save it to a file. Here is what I have so far: $ch = curl_init(); // set URL and other appropriate options curl_setopt($ch, CURLOPT_URL, "$rURL"); curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); curl_setopt($ch, CURLOPT_HEADER, 0); // grab URL and pass it to the browser $out = curl_exec($ch); // close cURL resource, and free up system resources curl_close($ch); $imgRes = imagecreatefromstring($out); imagejpeg($imgRes, $filename, 70); header("Content-Type: image/jpg"); header("Content-length: ".filesize($filename)); header("Content-Transfer-Encoding: binary"); header("Content-Length: ".filesize($filename)); readfile("$filename"); exit(); Update: updated the code to include an imagejpeg step to save the image at lower quality. But how do I then efficiently serve this up to the browser? I currently, later in the code, do readfile("$filename") along with some header information, but that means I'm reading the file back in again, which seems inefficient.
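    A hedged sketch of one way to avoid the extra read: render the JPEG into an output buffer once, then write the same bytes both to disk and to the browser. It assumes $out already holds the raw bytes fetched with cURL and $filename the target path, as in the excerpt.

    ```php
    <?php
    $imgRes = imagecreatefromstring($out);

    ob_start();
    imagejpeg($imgRes, null, 70);        // render into the output buffer instead of a file
    $jpegData = ob_get_clean();
    imagedestroy($imgRes);

    file_put_contents($filename, $jpegData);   // keep the saved copy

    header('Content-Type: image/jpeg');
    header('Content-Length: ' . strlen($jpegData));
    echo $jpegData;
    exit;
    ```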

    Read the article

  • Javascript with Django?

    - by Rosarch
    I know this has been asked before, but I'm having a hard time setting up JS on my Django web app, even though I'm reading the documentation. I'm running the Django dev server. My file structure looks like this: mysite/ __init__.py MySiteDB manage.py settings.py urls.py myapp/ __init__.py admin.py models.py test.py views.py templates/ index.html Where do I want to put the Javascript and CSS? I've tried it in a bunch of places, including myapp/, templates/ and mysite/, but none seem to work. From index.html: <head> <title>Degree Planner</title> <script type="text/javascript" src="/scripts/JQuery.js"></script> <script type="text/javascript" src="/media/scripts/sprintf.js"></script> <script type="text/javascript" src="/media/scripts/clientside.js"></script> </head> From urls.py: (r'^admin/', include(admin.site.urls)), (r'^media/(?P<path>.*)$', 'django.views.static.serve', {'document_root': 'media'}) (r'^.*', 'mysite.myapp.views.index'), I suspect that the serve() line is the cause of errors like: TypeError at /admin/auth/ 'tuple' object is not callable Just to round off the rampant flailing, I changed these settings in settings.py: MEDIA_ROOT = '/media/' MEDIA_URL = 'http://127.0.0.1:8000/media'
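    A hedged sketch of the urls.py with the trailing commas in place, since a missing comma between two pattern tuples is a classic cause of "'tuple' object is not callable"; the MEDIA_ROOT variant is taken from the settings-based line in the excerpt.

    ```python
    from django.conf.urls.defaults import patterns, include
    from django.contrib import admin
    import settings

    admin.autodiscover()

    urlpatterns = patterns('',
        (r'^admin/', include(admin.site.urls)),
        (r'^media/(?P<path>.*)$', 'django.views.static.serve',
            {'document_root': settings.MEDIA_ROOT}),   # note the comma after this tuple
        (r'^.*', 'mysite.myapp.views.index'),
    )
    ```

    With MEDIA_ROOT pointing at a real media/ folder, the script tags would then all use /media/scripts/... style paths.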

    Read the article

  • SAML Identity Provider based on Active Directory

    - by Jarret
    I have a 3rd party program that supports web SSO using SAML 1.1 (it is ready to serve as the Service Provider, in other words). We would like to implement this SSO for our intranet users based on their Active Directory credentials. In other words, they've already logged on to their system, so let's simply use those credentials to facilitate an SSO. I am a little overwhelmed at where to begin, though. My initial thought is that IIS / Active Directory could easily serve as the Identity Provider since IIS gives us "Integrated Windows Authentication" abilities. I would think we could just create a .NET web app that requires Integrated Authentication which simply extracts the current user ID, builds the SAML response, and re-directs the user back to the Service Provider with this SAML response to complete the SSO. But then, my problem is that I simply have no real idea of how to go about creating this SAML response, the X.509 certs involved, etc... I am wondering if I am in over my head on this, or if creating this SAML response should be relatively easy. Note this SSO is to be used by intranet users only, so no need to worry about federating with other companies / domains.
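    A hedged sketch of the first half of that idea: an ASP.NET handler that requires Integrated Windows Authentication and reads the caller's domain identity, which would then be packaged into the SAML response (assertion building and X.509 signing are not shown here).

    ```csharp
    using System.Web;

    public class WhoAmIHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            // With <authentication mode="Windows"/> and anonymous access disabled in IIS,
            // this is the DOMAIN\user of the already-logged-on intranet user.
            string windowsUser = context.User.Identity.Name;
            context.Response.ContentType = "text/plain";
            context.Response.Write(windowsUser);
        }

        public bool IsReusable { get { return false; } }
    }
    ```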

    Read the article

  • Building asynchronous cache pattern with JSP

    - by merweirdo
    I have a JSP that will take some 8 minutes to render. The code logic itself can not be made more efficient (it will update often and be updated by basically a pointy haired boss). I tried wrapping it with a caching layer like <%@ taglib uri="/WEB-INF/classes/oscache.tld" prefix="oscache" %> <oscache:cache time="60"> <div class="pagecontent"> ..... my logic </div> </oscache:cache> This is nice until the 60 seconds is over. The next query after that blocks until the 8 minutes of rendering is done with again. I would need a way to build a pattern something like: If there is no version of the dynamic content in the cache run the actual logic (and populate the cache for subsequent requests) If there is a non-expired version of the dynamic content in the cache serve the output of the JSP logic from the cache If there is an expired version of the dynamic content in the cache serve the output of the JSP logic still from the cache AND run the JSP logic in the background so that the cache gets updated transparently to the user - avoiding the user have to wait for 8 minutes I found out that at least EHCache might be able to do some asynchronous cache updating but it did not sadly seem to apply to the JSP tags... Also I have to take in 10-20 parameters for the actual logic of the JSP and some of them should be used as a key for caching. Code example and/or pointers would be greatly appreciated. I do not frankly care if the solution provided is extremely ugly. I just want a simple 5 minute caching with asynchronous cache update taking into account some parameters as a key.
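    A hedged sketch of the pattern described above, independent of OSCache/EHCache: serve whatever is cached immediately and, if it has expired, kick off exactly one background refresh. The key building and the 8-minute render call are placeholders to be wired up to the JSP's parameters.

    ```java
    import java.util.Map;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class StaleWhileRevalidateCache {
        private static final class Entry {
            final String html;
            final long renderedAt;
            Entry(String html, long renderedAt) { this.html = html; this.renderedAt = renderedAt; }
        }

        private final Map<String, Entry> cache = new ConcurrentHashMap<>();
        private final Map<String, Boolean> refreshing = new ConcurrentHashMap<>();
        private final ExecutorService pool = Executors.newFixedThreadPool(2);
        private final long maxAgeMillis = 5 * 60 * 1000;   // 5 minute freshness window

        public String get(String key, Callable<String> render) throws Exception {
            Entry entry = cache.get(key);
            if (entry == null) {
                // nothing cached yet: the very first caller has to wait for the slow render
                String html = render.call();
                cache.put(key, new Entry(html, System.currentTimeMillis()));
                return html;
            }
            boolean expired = System.currentTimeMillis() - entry.renderedAt > maxAgeMillis;
            if (expired && refreshing.putIfAbsent(key, Boolean.TRUE) == null) {
                pool.submit(() -> {              // refresh off the request thread
                    try {
                        cache.put(key, new Entry(render.call(), System.currentTimeMillis()));
                    } catch (Exception ignored) {
                        // keep serving the stale copy if the refresh fails
                    } finally {
                        refreshing.remove(key);
                    }
                });
            }
            return entry.html;                   // stale or fresh, serve what we have
        }
    }
    ```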

    Read the article

  • How to detect if a file is PDF or TIFF?

    - by eviljack
    Please bear with me as I've been thrown into the middle of this project without knowing all the background. If you've got WTF questions, trust me, I have them too. Here is the scenario: I've got a bunch of files residing on an IIS server. They have no file extension on them. Just naked files with names like "asda-2342-sd3rs-asd24-ut57" and so on. Nothing intuitive. The problem is I need to serve up files on an ASP.NET (2.0) page and display the tiff files as tiff and the PDF files as PDF. Unfortunately I don't know which is which and I need to be able to display them appropriately in their respective formats. For example, lets say that there are 2 files I need to display, one is tiff and one is PDF. The page should show up with a tiff image, and perhaps a link that would open up the PDF in a new tab/window. The problem: As these files are all extension-less I had to force IIS to just serve everything up as TIFF. But if I do this, the PDF files won't display. I could change IIS to force the MIME type to be PDF for unknown file extensions but I'd have the reverse problem. http://support.microsoft.com/kb/326965 Is this problem easier than I think or is it as nasty as I am expecting?
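    This tends to be easier than it looks: both formats have fixed signatures in the first few bytes, so the type can be sniffed server-side and the right Content-Type sent per file. A hedged C# sketch (paths and the fallback type are illustrative):

    ```csharp
    using System.IO;

    public static class FileSniffer
    {
        public static string GetContentType(string path)
        {
            var header = new byte[4];
            using (var fs = File.OpenRead(path))
            {
                if (fs.Read(header, 0, 4) < 4)
                    return "application/octet-stream";
            }
            if (header[0] == 0x25 && header[1] == 0x50 && header[2] == 0x44 && header[3] == 0x46)
                return "application/pdf";        // "%PDF"
            if ((header[0] == 0x49 && header[1] == 0x49 && header[2] == 0x2A && header[3] == 0x00) ||
                (header[0] == 0x4D && header[1] == 0x4D && header[2] == 0x00 && header[3] == 0x2A))
                return "image/tiff";             // "II*\0" or "MM\0*"
            return "application/octet-stream";
        }
    }
    ```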

    Read the article

  • Has anyone set up Tomcat to run virtual hosts using mod_jk?

    - by Adam
    I work in OS X, primarily with PHP. Normally I work locally using MAMP and virtual hosts set up in my httpd.conf so that I can point a browser to http://some-project and have as many projects as I need set up. We have a project coming up where we need to serve JSP pages and I would like to set up my local Apache server to serve only JSP files to Tomcat and everything else to MAMP using the same virtual hosts setup in: ~/applications/MAMP/conf/apache/httpd.conf So far I have: Successfully installed Tomcat Placed mod_jk.so in ~/applications/MAMP/Library/modules/mod_jk.so Added the module by placing: LoadModule jk_module modules/mod_jk.so in ~/applications/MAMP/conf/apache/httpd.conf Created /Library/Tomcat/Home/conf/jk/workers.properties and added the following lines: workers.tomcat_home=/Library/Tomcat workers.java_home=/System/Library/Frameworks/JavaVM.framework/Versions/1.5.0/Home ps=/ worker.list=ajp12, ajp13 worker.ajp13.port=8009 worker.ajp13.host=localhost worker.ajp12.type=ajp13 worker.ajp13.mount=/*.jsp added the following lines: JkWorkersFile /Library/Tomcat/Home/conf/workers.properties JkLogFile /Library/Tomcat/Home/logs/mod_jk.log JkLogLevel debug to ~/applications/MAMP/conf/apache/httpd.conf However, I cannot start MAMP when these last lines are present in my httpd.conf. Does anyone work like this? Any tips? Any clear ideas of what I'm doing wrong?
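    For comparison, a hedged sketch of how the Apache side of such a setup is often laid out (paths, hostnames and log level are illustrative; the workers file path should match wherever it was actually created):

    ```apache
    LoadModule jk_module modules/mod_jk.so
    JkWorkersFile /Library/Tomcat/Home/conf/jk/workers.properties
    JkLogFile     /Library/Tomcat/Home/logs/mod_jk.log
    JkLogLevel    info

    <VirtualHost *:80>
        ServerName   some-jsp-project
        DocumentRoot "/Users/me/Sites/some-jsp-project"
        # only *.jsp goes to Tomcat via the ajp13 worker, everything else stays with Apache/MAMP
        JkMount /*.jsp ajp13
    </VirtualHost>
    ```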

    Read the article

  • Is www.example.com/post/21/edit a RESTful URI? I think I know the answer, but have another question.

    - by tmadsen
    I'm almost afraid to post this question, there has to be an obvious answer I've overlooked, but here I go: Context: I am creating a blog for educational purposes (want to learn python and web.py). I've decided that my blog have posts, so I've created a Post class. I've also decided that posts can be created, read, updated, or deleted (so CRUD). So in my Post class, I've created methods that respond to POST, GET, PUT, and DELETE HTTP methods). So far so good. The current problem I'm having is a conceptual one, I know that sending a PUT HTTP message (with an edited Post) to, e.g., /post/52 should update post with id 52 with the body contents of the HTTP message. What I do not know is how to conceptually correctly serve the (HTML) edit page. Will doing it like this: /post/52/edit violate the idea of URI, as 'edit' is not a resource, but an action? On the other side though, could it be considered a resource since all that URI will respond to is a GET method, that will only return an HTML page? So my ultimate question is this: How do I serve an HTML page intended for user editing in a RESTful manner?
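    A hedged web.py sketch (the framework mentioned above) of the common answer: treat the edit form itself as a resource that only answers GET, while the post answers GET/PUT/DELETE. Handler bodies are placeholders.

    ```python
    import web

    urls = (
        r'/post/(\d+)/edit', 'PostEditForm',
        r'/post/(\d+)',      'Post',
    )

    class PostEditForm:
        def GET(self, post_id):
            # serve the HTML form pre-filled with post `post_id`
            return "<form action='/post/%s' method='post'>...</form>" % post_id

    class Post:
        def GET(self, post_id):    return "post %s" % post_id
        def PUT(self, post_id):    return "updated %s" % post_id
        def DELETE(self, post_id): return "deleted %s" % post_id

    if __name__ == '__main__':
        web.application(urls, globals()).run()
    ```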

    Read the article

  • Pushing a local Mercurial repository to a remote server, or cloning it on the server from the local copy

    - by Samaursa
    I have a local repository that I have now decided to push to a remote server (for example, I have a host that allows mercurial repositories and I am also trying to push to bitbucket). The repository has a lot of files and is a little more than 200mb. Locally, I am able to clone the repository without problems. Now I have a lot of changes in this repository, and I have wasted a couple of days trying to figure out how to get the remote server to clone my repository. I cannot get hg serve to work outside of the LAN. I have tried everything. So instead, I created a new repository at the remote servers (both at the host and bitbucket) with nothing in it. Now I am pushing the complete repository that I have locally to these remote locations. So far it has been unsuccessful, as the push operation is stuck on searching for changes and does not give me any other useful output. I have let it go for about an hour with no change. Now my questions is, what am I doing wrong as far as hg serve is concerned? I can access it locally but not remotely (through DynDns - I have configured it properly and the router forwards the ports correctly) so that I can get the server to clone the repository the first time after which I will be pushing to it. My second question is, assuming the clone at server does not work (for example, if I was to push my current repository to bitbucket), is creating an empty repository at the server and then pushing a local repository to the new remote repository ok? Is that the source of the searching for changes problem? Any help in this regard would be greatly appreciated.
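    A hedged sketch of the two approaches discussed above from the command line; hostnames, ports and repository names are illustrative, and the forwarded port has to actually reach the machine running hg serve.

    ```sh
    # On the machine with the existing repository (allow pushes only if you really need them):
    hg serve --port 8000 --config web.push_ssl=false --config "web.allow_push=*"

    # From the remote server, clone over the forwarded port:
    hg clone http://my-dyndns-name.example.com:8000/ myrepo

    # Or go the other way: push the full local history into an empty remote repository:
    hg push https://bitbucket.org/username/myrepo
    ```

    Pushing a full local history into a freshly created empty remote repository is a normal workflow; with roughly 200 MB of history the initial push may sit at "searching for changes" for quite a while before the bulk upload starts, so running it with --verbose and giving it time is worth trying before assuming it is stuck.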

    Read the article

  • PHP question about global variables and form requests

    - by user220201
    Hi, This is probably a stupid question but I'll ask anyway since I have no idea. I have written basic PHP code which serves forms. Say I have a login page and I serve it using the login.php page and it will be called in the login.html page like this - <form action="login.php" method="post"> By this it is also implied that every POST needs its own PHP file, doesn't it? This kind of feels weird. Is there a way to have a single file, say code.php, and just have each of the forms as functions instead? EDIT: Specifically, say I have 5 forms that are used one after the other in my application. Say after login the user does A, B, C and D tasks, each of which is sent to the server as a POST request. So instead of having A.php, B.php, C.php and D.php I would like to have a single code.php and have A(), B(), C() and D() as functions. Is there a way to do this? Also on the same note, how do I deal with, say, a global array (e.g. an array of currently logged in users) across multiple forms? I want to do this without writing to a DB. I know it's probably better to write to a DB and query, but is it even possible to do it with a global array? The reason I was thinking about having all the form functions in one file is to use a global array. Thanks, - Pav
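    A hedged sketch of the single-file approach being asked about: every form posts to code.php with a hidden "action" field and a small dispatcher calls the matching function (bodies are placeholders). Note that ordinary PHP variables do not survive between requests, so a "currently logged in users" array would need $_SESSION (per user) or some shared store rather than a plain global.

    ```php
    <?php
    // code.php - one entry point for several forms
    function handle_login($post) { /* check credentials, start session ... */ }
    function handle_task_a($post) { /* ... */ }
    function handle_task_b($post) { /* ... */ }

    $handlers = array(
        'login' => 'handle_login',
        'a'     => 'handle_task_a',
        'b'     => 'handle_task_b',
    );

    $action = isset($_POST['action']) ? $_POST['action'] : '';

    if (isset($handlers[$action])) {
        $handlers[$action]($_POST);
    } else {
        header('HTTP/1.1 400 Bad Request');
        echo 'Unknown action';
    }
    ```

    Each form then points at the same file: <form action="code.php" method="post"> plus a hidden <input type="hidden" name="action" value="login">.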

    Read the article

  • CDN for Images in ASP.NET

    - by Chris
    I am in the process of moving all of the images in my web application over to a CDN but I want to easily be able to switch the CDN on or off without having to hard code the path to the images. My first thought was to add an HttpHandler for image extensions that, depending on a variable in the web.config (something like ), will serve the image from the server or from the CDN. But after giving this a little thought I think I've essentially ruled this out, as it will cause ASP.NET to handle the request for every single image, thus adding overhead, and it might actually completely negate the benefits of using a CDN. An alternative approach is, since all of my pages inherit from a base page class, I could create a function in the base class that determines what path to serve the files from based on the web.config variable. I would then do something like this in the markup: <img src='<%= GetImagePath() %>/image.png' /> I think this is probably what I'll have to end up doing, but it seems a little clunky to me. I also envision problems with the old .NET error of not being able to modify the control collection because of the "<%=", though the "<%#" solution will probably work. Any thoughts or ideas on how to implement this?
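    A hedged sketch of the base-page helper described above; the appSetting names ("UseCdn", "CdnRoot") and paths are illustrative, not an established convention.

    ```csharp
    using System;
    using System.Configuration;
    using System.Web.UI;

    public class BasePage : Page
    {
        protected string GetImagePath()
        {
            bool useCdn = string.Equals(ConfigurationManager.AppSettings["UseCdn"], "true",
                                        StringComparison.OrdinalIgnoreCase);
            return useCdn
                ? ConfigurationManager.AppSettings["CdnRoot"]   // e.g. https://cdn.example.com/images
                : ResolveUrl("~/images");                       // keep serving locally
        }
    }
    ```

    Markup would then use <img src='<%= GetImagePath() %>/image.png' />, and flipping the single appSetting switches every page between local and CDN paths.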

    Read the article

  • Serving .docx files through PHP

    - by user275074
    Hi, I'm having issues when attempting to serve a .docx file using Php. When uploading the file I detect the file mime type and upload the file using the file with the correct extension based on the mime type; e.g. below: application/msword - doc application/vnd.openxmlformats-officedocument.wordprocessingml.document - docx When attempting to serve the files for download, I do the reverse in detecting the extension and serving based on the mime type e.g. public static function fileMimeType($extention) { if(!is_null($extention)) { switch($extention) { case 'txt': return 'text/plain'; break; case 'odt': return 'application/vnd.oasis.opendocument.text'; break; case 'doc': return 'application/msword'; break; case 'docx': return 'application/vnd.openxmlformats-officedocument.wordprocessingml.document'; break; case ('jpg' || 'jpeg'): return 'image/jpeg'; break; case 'png': return 'image/png'; break; case 'pdf': return 'application/pdf'; break; default: break; } } } All files appear to download correctly and open fine but when attempting to open a docx file, Word (on multiple files) throws a error stating the file is corrupt. Any ideas would be great, thanks.
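    A hedged sketch of the download path: two common causes of "corrupt" .docx downloads are stray output (whitespace, a BOM, a PHP notice) emitted before the file bytes and a wrong or missing Content-Length, so clear any buffered output and send the exact byte count. Path and filename are illustrative.

    ```php
    <?php
    $path = '/var/uploads/report.docx';
    $mime = 'application/vnd.openxmlformats-officedocument.wordprocessingml.document';

    while (ob_get_level() > 0) {
        ob_end_clean();                 // drop anything already buffered
    }

    header('Content-Type: ' . $mime);
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="report.docx"');
    readfile($path);
    exit;
    ```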

    Read the article

  • Is there any way to configure what reCAPTCHA is actually displaying?

    - by trejder
    Is there any way to control what kind of image is displayed to the user in reCAPTCHA, or what kind of puzzle he/she is required to solve? I have noticed at least two significant changes to what reCAPTCHA is serving (and I must admit I don't much like these changes): For years reCAPTCHA served two words from scanned books and the user was required to solve one of them. They were clearly readable (even the "second" ones, which could be omitted) and a human had nearly no problem solving them. For the past few months I have noticed a significant change on all of my sites that use reCAPTCHA. They started to show a combination of a computer-generated string of digits and something that looks to me like a street/house number photographed for Google Street View. These are even easier to solve, but, importantly, it started to happen more and more often that the user is obliged to solve both of them. Now I have noticed another change/regression. Some of my sites remain at this so-called "level 2" (as above) and some of them have started to serve two words again ("level 1"?). Again, there are more and more situations where solving both words is required. But, most importantly, on this "level" the words are nearly impossible to solve (on my old mobile devices with a 3.5'' display I need 5-6 attempts to pass!). They're cluttered, written in some strange font, mostly in italics, with a lot of black and white stains or drops on the letters, etc. Plus, reCAPTCHA has stopped being consistent -- some of my pages are still serving "level 2" while others are "killing" end users with a need to solve "level 3". Is there any way I can control this -- force it to use only "level 2" on all my pages? (Of course, I'm using exactly the same piece of code to serve reCAPTCHA on all my pages.) Note that I'm not asking for something like in this question. I don't want to change what reCAPTCHA shows (to disable words in favor of only numbers, for example). I only want to control which "version" of puzzle (among those described above) reCAPTCHA shows, and I want it to be the same on all my sites.

    Read the article

  • Is this method of static file serving safe in node.js? (potential security hole?)

    - by MikeC8
    I want to create the simplest node.js server to serve static files. Here's what I came up with: fs = require('fs'); server = require('http').createServer(function(req, res) { res.end(fs.readFileSync(__dirname + '/public/' + req.url)); }); server.listen(8080); Clearly this would map http://localhost:8080/index.html to project_dir/public/index.html, and similarly so for all other files. My one concern is that someone could abuse this to access files outside of project_dir/public. Something like this, for example: http://localhost:8080/../../sensitive_file.txt I tried this a little bit, and it wasn't working. But, it seems like my browser was removing the ".." itself. Which leads me to believe that someone could abuse my poor little node.js server. I know there are npm packages that do static file serving. But I'm actually curious to write my own here. So my questions are: Is this safe? If so, why? If not, why not? And, if further, if not, what is the "right" way to do this? My one constraint is I don't want to have to have an if clause for each possible file, I want the server to serve whatever files I throw in a directory.
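    A hedged sketch of the usual guard: resolve the requested path against the public root, refuse anything that escapes it, and stream the file asynchronously instead of readFileSync. Error handling is deliberately minimal.

    ```js
    const fs = require('fs');
    const http = require('http');
    const path = require('path');

    const root = path.join(__dirname, 'public');

    http.createServer((req, res) => {
      const requested = path.normalize(decodeURIComponent(req.url.split('?')[0]));
      const filePath = path.join(root, requested);

      // "../" sequences or anything else that escapes the public root lands here
      if (filePath !== root && !filePath.startsWith(root + path.sep)) {
        res.writeHead(403);
        return res.end('Forbidden');
      }

      fs.createReadStream(filePath)
        .on('error', () => { res.writeHead(404); res.end('Not found'); })
        .pipe(res);
    }).listen(8080);
    ```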

    Read the article

  • How can I measure my (SAMP) server's bandwidth usage?

    - by enkrates
    I'm running a Solaris server to serve PHP through Apache. What tools can I use to measure the bandwidth my server is currently using? I use Google analytics to measure traffic, but as far as I know, it ignores file size. I have a rough idea of the average size of the pages I serve, and can do a back-of-the-envelope calculation of my bandwidth usage by multiplying page views (from Google) by average page size, but I'm looking for a solution that is more rigorous and exact. Also, I'm not trying to throttle anything, or implement usage caps or anything like that. I'd just like to measure the bandwidth usage, so I know what it is. An example of what I'm after is the usage meter that Slicehost provides in their admin website for their users. They tell me (for another site I run) how much bandwidth I've used each month and also divide the usage for uploading and downloading. So, it seems like this data can be measured, and I'd like to be able to do it myself. To put it simply, what is the conventional method for measuring the bandwidth usage of my server?
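    One conventional, if rough, approach is to sum the response sizes Apache already writes to its access log (the %b field in the common/combined format). A hedged sketch; the log path and field position depend on the local LogFormat, and this only counts response bodies, not headers or uploads.

    ```sh
    # total bytes served in this log file, reported in GB
    awk '{ sum += $10 } END { printf "%.2f GB\n", sum / 1024 / 1024 / 1024 }' /var/apache/logs/access_log
    ```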

    Read the article

  • What’s new in ASP.NET 4.0: Core Features

    - by Rick Strahl
    Microsoft released the .NET Runtime 4.0 and with it comes a brand spanking new version of ASP.NET – version 4.0 – which provides an incremental set of improvements to an already powerful platform. .NET 4.0 is a full release of the .NET Framework, unlike version 3.5, which was merely a set of library updates on top of the .NET Framework version 2.0. Because of this full framework revision, there has been a welcome bit of consolidation of assemblies and configuration settings. The full runtime version change to 4.0 also means that you have to explicitly pick version 4.0 of the runtime when you create a new Application Pool in IIS, unlike .NET 3.5, which actually requires version 2.0 of the runtime. In this first of two parts I'll take a look at some of the changes in the core ASP.NET runtime. In the next edition I'll go over improvements in Web Forms and Visual Studio. Core Engine Features Most of the high profile improvements in ASP.NET have to do with Web Forms, but there are a few gems in the core runtime that should make life easier for ASP.NET developers. The following list describes some of the things I've found useful among the new features. Clean web.config Files Are Back! If you've been using ASP.NET 3.5, you probably have noticed that the web.config file has turned into quite a mess of configuration settings between all the custom handler and module mappings for the various web server versions. Part of the reason for this mess is that .NET 3.5 is a collection of add-on components running on top of the .NET Runtime 2.0 and so almost all of the new features of .NET 3.5 where essentially introduced as custom modules and handlers that had to be explicitly configured in the config file. Because the core runtime didn't rev with 3.5, all those configuration options couldn't be moved up to other configuration files in the system chain. With version 4.0 a consolidation was possible, and the result is a much simpler web.config file by default. A default empty ASP.NET 4.0 Web Forms project looks like this: <?xml version="1.0"?> <configuration> <system.web> <compilation debug="true" targetFramework="4.0" /> </system.web> </configuration> Need I say more? Configuration Transformation Files to Manage Configurations and Application Packaging ASP.NET 4.0 introduces the ability to create multi-target configuration files. This means it's possible to create a single configuration file that can be transformed based on relatively simple replacement rules using a Visual Studio and WebDeploy provided XSLT syntax. The idea is that you can create a 'master' configuration file and then create customized versions of this master configuration file by applying some relatively simplistic search and replace, add or remove logic to specific elements and attributes in the original file. 
To give you an idea, here's the example code that Visual Studio creates for a default web.Release.config file, which replaces a connection string, removes the debug attribute and replaces the CustomErrors section: <?xml version="1.0"?> <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform"> <connectionStrings> <add name="MyDB" connectionString="Data Source=ReleaseSQLServer;Initial Catalog=MyReleaseDB;Integrated Security=True" xdt:Transform="SetAttributes" xdt:Locator="Match(name)"/> </connectionStrings> <system.web> <compilation xdt:Transform="RemoveAttributes(debug)" /> <customErrors defaultRedirect="GenericError.htm" mode="RemoteOnly" xdt:Transform="Replace"> <error statusCode="500" redirect="InternalError.htm"/> </customErrors> </system.web> </configuration> You can see the XSL transform syntax that drives this functionality. Basically, only the elements listed in the override file are matched and updated – all the rest of the original web.config file stays intact. Visual Studio 2010 supports this functionality directly in the project system so it's easy to create and maintain these customized configurations in the project tree. Once you're ready to publish your application, you can then use the Publish <yourWebApplication> option on the Build menu which allows publishing to disk, via FTP or to a Web Server using Web Deploy. You can also create a deployment package as a .zip file which can be used by the WebDeploy tool to configure and install the application. You can manually run the Web Deploy tool or use the IIS Manager to install the package on the server or other machine. You can find out more about WebDeploy and Packaging here: http://tinyurl.com/2anxcje. Improved Routing Routing provides a relatively simple way to create clean URLs with ASP.NET by associating a template URL path and routing it to a specific ASP.NET HttpHandler. Microsoft first introduced routing with ASP.NET MVC and then they integrated routing with a basic implementation in the core ASP.NET engine via a separate ASP.NET routing assembly. In ASP.NET 4.0, the process of using routing functionality gets a bit easier. First, routing is now rolled directly into System.Web, so no extra assembly reference is required in your projects to use routing. The RouteCollection class now includes a MapPageRoute() method that makes it easy to route to any ASP.NET Page requests without first having to implement an IRouteHandler implementation. It would have been nice if this could have been extended to serve *any* handler implementation, but unfortunately for anything but a Page derived handlers you still will have to implement a custom IRouteHandler implementation. ASP.NET Pages now include a RouteData collection that will contain route information. Retrieving route data is now a lot easier by simply using this.RouteData.Values["routeKey"] where the routeKey is the value specified in the route template (i.e., "users/{userId}" would use Values["userId"]). 
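A hedged sketch of what registering such a page route looks like in practice, typically called from Application_Start in Global.asax; the route name, template and target page are illustrative.

```csharp
using System.Web.Routing;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.MapPageRoute(
            "users",              // route name used by GetRouteUrl()/RedirectToRoute()
            "users/{userId}",     // URL template
            "~/UserDetail.aspx"); // physical Web Form that serves the request
    }
}
```

Inside UserDetail.aspx the value is then available as this.RouteData.Values["userId"], as described above.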
The Page class also has a GetRouteUrl() method that you can use to create URLs with route data values rather than hardcoding the URL: <%= this.GetRouteUrl("users",new { userId="ricks" }) %> You can also use the new Expression syntax using <%$RouteUrl %> to accomplish something similar, which can be easier to embed into Page or MVC View code: <a runat="server" href='<%$RouteUrl:RouteName=user, id=ricks %>'>Visit User</a> Finally, the Response object also includes a new RedirectToRoute() method to build a route url for redirection without hardcoding the URL. Response.RedirectToRoute("users", new { userId = "ricks" }); All of these routines are helpers that have been integrated into the core ASP.NET engine to make it easier to create routes and retrieve route data, which hopefully will result in more people taking advantage of routing in ASP.NET. To find out more about the routing improvements you can check out Dan Maharry's blog which has a couple of nice blog entries on this subject: http://tinyurl.com/37trutj and http://tinyurl.com/39tt5w5. Session State Improvements Session state is an often used and abused feature in ASP.NET and version 4.0 introduces a few enhancements geared towards making session state more efficient and to minimize at least some of the ill effects of overuse. The first improvement affects out of process session state, which is typically used in web farm environments or for sites that store application sensitive data that must survive AppDomain restarts (which in my opinion is just about any application). When using OutOfProc session state, ASP.NET serializes all the data in the session statebag into a blob that gets carried over the network and stored either in the State server or SQL Server via the Session provider. Version 4.0 provides some improvement in this serialization of the session data by offering an enableCompression option on the web.Config <Session> section, which forces the serialized session state to be compressed. Depending on the type of data that is being serialized, this compression can reduce the size of the data travelling over the wire by as much as a third. It works best on string data, but can also reduce the size of binary data. In addition, ASP.NET 4.0 now offers a way to programmatically turn session state on or off as part of the request processing queue. In prior versions, the only way to specify whether session state is available is by implementing a marker interface on the HTTP handler implementation. In ASP.NET 4.0, you can now turn session state on and off programmatically via HttpContext.Current.SetSessionStateBehavior() as part of the ASP.NET module pipeline processing as long as it occurs before the AquireRequestState pipeline event. Output Cache Provider Output caching in ASP.NET has been a very useful but potentially memory intensive feature. The default OutputCache mechanism works through in-memory storage that persists generated output based on various lifetime related parameters. While this works well enough for many intended scenarios, it also can quickly cause runaway memory consumption as the cache fills up and serves many variations of pages on your site. ASP.NET 4.0 introduces a provider model for the OutputCache module so it becomes possible to plug-in custom storage strategies for cached pages. 
One of the goals also appears to be to consolidate some of the different cache storage mechanisms used in .NET in general to a generic Windows AppFabric framework in the future, so various different mechanisms like OutputCache, the non-Page specific ASP.NET cache and possibly even session state eventually can use the same caching engine for storage of persisted data both in memory and out of process scenarios. For developers, the OutputCache provider feature means that you can now extend caching on your own by implementing a custom Cache provider based on the System.Web.Caching.OutputCacheProvider class. You can find more info on creating an Output Cache provider in Gunnar Peipman's blog at: http://tinyurl.com/2vt6g7l. Response.RedirectPermanent ASP.NET 4.0 includes features to issue a permanent redirect that issues as an HTTP 301 Moved Permanently response rather than the standard 302 Redirect response. In pre-4.0 versions you had to manually create your permanent redirect by setting the Status and StatusCode properties – Response.RedirectPermanent() makes this operation more obvious and discoverable. There's also a Response.RedirectToRoutePermanent() which provides permanent redirection of route Urls. Preloading of Applications ASP.NET 4.0 provides a new feature to preload ASP.NET applications on startup, which is meant to provide a more consistent startup experience. If your application has a lengthy startup cycle it can appear very slow to serve data to clients while the application is warming up and loading initial resources. So rather than serve these startup requests slowly in ASP.NET 4.0, you can force the application to initialize itself first before even accepting requests for processing. This feature works only on IIS 7.5 (Windows 7 and Windows Server 2008 R2) and works in combination with IIS. You can set up a worker process in IIS 7.5 to always be running, which starts the Application Pool worker process immediately. ASP.NET 4.0 then allows you to specify site-specific settings by setting serviceAutoStartEnabled on a particular site along with an optional serviceAutoStartProvider class that can be used to receive "startup events" when the application starts up. This event in turn can be used to configure the application and optionally pre-load cache data and other information required by the app on startup. The configuration settings need to be made in applicationhost.config: <sites> <site name="WebApplication2" id="1"> <application path="/" serviceAutoStartEnabled="true" serviceAutoStartProvider="PreWarmup" /> </site> </sites> <serviceAutoStartProviders> <add name="PreWarmup" type="PreWarmupProvider,MyAssembly" /> </serviceAutoStartProviders> Hooking up a warm up provider is optional so you can omit the provider definition and reference. If you do define it, here's what it looks like: public class PreWarmupProvider : System.Web.Hosting.IProcessHostPreloadClient { public void Preload(string[] parameters) { // initialization for app } } While this code is running, ASP.NET/IIS will hold requests back from hitting the pipeline. So until this code completes the application will not start taking requests. The idea is that you can perform any pre-loading of resources and cache values so that the first request will be ready to perform at optimal performance level without lag.
Runtime Performance Improvements According to Microsoft, there have also been a number of invisible performance improvements in the internals of the ASP.NET runtime that should make ASP.NET 4.0 applications run more efficiently and use less resources. These features come without any change requirements in applications and are virtually transparent, except that you get the benefits by updating to ASP.NET 4.0. Summary The core feature set changes are minimal which continues a tradition of small incremental changes to the ASP.NET runtime. ASP.NET has been proven as a solid platform and I'm actually rather happy to see that most of the effort in this release went into stability, performance and usability improvements rather than a massive amount of new features. The new functionality added in 4.0 is minimal but very useful. A lot of people are still running pure .NET 2.0 applications these days and have stayed off of .NET 3.5 for some time now. I think that version 4.0 with its full .NET runtime rev and assembly and configuration consolidation will make an attractive platform for developers to update to. If you're a Web Forms developer in particular, ASP.NET 4.0 includes a host of new features in the Web Forms engine that are significant enough to warrant a quick move to .NET 4.0. I'll cover those changes in my next column. Until then, I suggest you give ASP.NET 4.0 a spin and see for yourself how the new features can help you out. © Rick Strahl, West Wind Technologies, 2005-2010Posted in ASP.NET  

    Read the article

  • DevConnections Session Slides, Samples and Links

    - by Rick Strahl
    Finally coming up for air this week, after catching up with being on the road for the better part of three weeks. Here are my slides, samples and links for my four DevConnections Session two weeks ago in Vegas. I ended up doing one extra un-prepared for session on WebAPI and AJAX, as some of the speakers were either delayed or unable to make it at all to Vegas due to Sandy's mayhem. It was pretty hectic in the speaker room as Erik (our event coordinator extrodinaire) was scrambling to fill session slots with speakers :-). Surprisingly it didn't feel like the storm affected attendance drastically though, but I guess it's hard to tell without actual numbers. The conference was a lot of fun - it's been a while since I've been speaking at one of these larger conferences. I'd been taking a hiatus, and I forgot how much I enjoy actually giving talks. Preparing - well not  quite so much, especially since I ended up essentially preparing or completely rewriting for all three of these talks and I was stressing out a bit as I was sick the week before the conference and didn't get as much time to prepare as I wanted to. But - as always seems to be the case - it all worked out, but I guess those that attended have to be the judge of that… It was great to catch up with my speaker friends as well - man I feel out of touch. I got to spend a bunch of time with Dan Wahlin, Ward Bell, Julie Lerman and for about 10 minutes even got to catch up with the ever so busy Michele Bustamante. Lots of great technical discussions including a fun and heated REST controversy with Ward and Howard Dierking. There were also a number of great discussions with attendees, describing how they're using the technologies touched in my talks in live applications. I got some great ideas from some of these and I wish there would have been more opportunities for these kinds of discussions. One thing I miss at these Vegas events though is some sort of coherent event where attendees and speakers get to mingle. These Vegas conferences are just like "go to sessions, then go out and PARTY on the town" - it's Vegas after all! But I think that it's always nice to have at least one evening event where everybody gets to hang out together and trade stories and geek talk. Overall there didn't seem to be much opportunity for that beyond lunch or the small and short exhibit hall events which it seemed not many people actually went to. Anyways, a good time was had. I hope those of you that came to my sessions learned something useful. There were lots of great questions and discussions after the sessions - always appreciate hearing the real life scenarios that people deal with in relation to the abstracted scenarios in sessions. Here are the Session abstracts, a few comments and the links for downloading slides and  samples. It's not quite like being there, but I hope this stuff turns out to be useful to some of you. I'll be following up a couple of these sessions with white papers in the following weeks. Enjoy. ASP.NET Architecture: How ASP.NET Works at the Low Level Abstract:Interested in how ASP.NET works at a low level? ASP.NET is extremely powerful and flexible technology, but it's easy to forget about the core framework that underlies the higher level technologies like ASP.NET MVC, WebForms, WebPages, Web Services that we deal with on a day to day basis. 
The ASP.NET core drives all the higher level handlers and frameworks layered on top of it and with the core power comes some complexity in the form of a very rich object model that controls the flow of a request through the ASP.NET pipeline from Windows HTTP services down to the application level. To take full advantage of it, it helps to understand the underlying architecture and model. This session discusses the architecture of ASP.NET along with a number of useful tidbits that you can use for building and debugging your ASP.NET applications more efficiently. We look at overall architecture, how requests flow from the IIS (7 and later) Web Server to the ASP.NET runtime into HTTP handlers, modules and filters and finally into high-level handlers like MVC, Web Forms or Web API. Focus of this session is on the low-level aspects on the ASP.NET runtime, with examples that demonstrate the bootstrapping of ASP.NET, threading models, how Application Domains are used, startup bootstrapping, how configuration files are applied and how all of this relates to the applications you write either using low-level tools like HTTP handlers and modules or high-level pages or services sitting at the top of the ASP.NET runtime processing chain. Comments:I was surprised to see so many people show up for this session - especially since it was the last session on the last day and a short 1 hour session to boot. The room was packed and it was to see so many people interested the abstracts of architecture of ASP.NET beyond the immediate high level application needs. Lots of great questions in this talk as well - I only wish this session would have been the full hour 15 minutes as we just a little short of getting through the main material (didn't make it to Filters and Error handling). I haven't done this session in a long time and I had to pretty much re-figure all the system internals having to do with the ASP.NET bootstrapping in light for the changes that came with IIS 7 and later. The last time I did this talk was with IIS6, I guess it's been a while. I love doing this session, mainly because in my mind the core of ASP.NET overall is so cleanly designed to provide maximum flexibility without compromising performance that has clearly stood the test of time in the 10 years or so that .NET has been around. While there are a lot of moving parts, the technology is easy to manage once you understand the core components and the core model hasn't changed much even while the underlying architecture that drives has been almost completely revamped especially with the introduction of IIS 7 and later. Download Samples and Slides   Introduction to using jQuery with ASP.NET Abstract:In this session you'll learn how to take advantage of jQuery in your ASP.NET applications. Starting with an overview of jQuery client features via many short and fun examples, you'll find out about core features like the power of selectors for document element selection, manipulating these elements with jQuery's wrapped set methods in a browser independent way, how to hook up and handle events easily and generally apply concepts of unobtrusive JavaScript principles to client scripting. The second half of the session then delves into jQuery's AJAX features and several different ways how you can interact with ASP.NET on the server. You'll see examples of using ASP.NET MVC for serving HTML and JSON AJAX content, as well as using the new ASP.NET Web API to serve JSON and hypermedia content. 
You'll also see examples of client side templating/databinding with Handlebars and Knockout. Comments:This session was in a monster of a room and to my surprise it was nearly packed, given that this was a 100 level session. I can see that it's a good idea to continue to do intro sessions to jQuery as there appeared to be quite a number of folks who had not worked much with jQuery yet and who most likely could greatly benefit from using it. It seemed to me the session got more than a few people excited to get going if they hadn't yet :-). Anyway I just love doing this session because it's mostly live coding and highly interactive - there aren't many sessions where I can build things up from scratch and iterate in an hour. jQuery makes that easy though. Resources: Slides and Code Samples Introduction to jQuery White Paper Introduction to ASP.NET Web API   Hosting the Razor Scripting Engine in Your Own Applications Abstract:The Razor Engine used in ASP.NET MVC and ASP.NET Web Pages is a free-standing scripting engine that can be disassociated from these Web-specific implementations and can be used in your own applications. Razor allows for a powerful mix of code and text rendering that makes it a wonderful tool for any sort of text generation, from creating HTML output in non-Web applications, to rendering mail merge-like functionality, to code generation for developer tools and even as a plug-in scripting engine. In this session, we'll look at the components that make up the Razor engine and how you can bootstrap it in your own applications to hook up templating. You'll find out how to create custom templates and manage Razor requests that can be pre-compiled, detect page changes and act in ways similar to a full runtime. We look at ways that you can pass data into the engine and retrieve both the rendered output as well as result values in a package that makes it easy to plug Razor into your own applications. Comments:That this session was picked was a bit of a surprise to me, since it's a bit of a niche topic. Even more of a surprise was that during the session quite a few people who attended had actually used Razor externally and were there to find out more about how the process works and how to extend it. In the session I talk a bit about a custom Razor hosting implementation (Westwind.RazorHosting) and drilled into the various components required to build a custom Razor Hosting engine and a runtime around it. This session was a bit of a chore to prepare for as there are lots of technical implementation details that needed to be dealt with, and squeezing that into an hour and 15 minutes is a bit tight (details that aren't addressed even by some of the wrapper libraries that exist). Found out though that there's quite a bit of interest in using a templating engine outside of web applications, or often side by side with the HTML output generated by frameworks like MVC or WebForms. An extra fun part of this session was that this was my first session and when I went to set up I realized I forgot my mini-DVI to VGA adapter cable to plug into the projector in my room - 6 minutes before the session was about to start. So I ended up sprinting the half a mile + back to my room - and back at a full sprint. I managed to be back only a couple of minutes late, but when I started I was out of breath for the first 10 minutes or so, while trying to talk.
Musta sounded a bit funny as I was trying to not gasp too much :-) Resources: Slides and Code Samples Westwind.RazorHosting GitHub Project Original RazorHosting Blog Post   Introduction to ASP.NET Web API for AJAX Applications Abstract:WebAPI provides a new framework for creating REST based APIs, but it can also act as a backend to typical AJAX operations. This session covers the core features of Web API as it relates to typical AJAX application development. We’ll cover content-negotiation, routing and a variety of output generation options as well as managing data updates from the client in the context of a small Single Page Application style Web app. Finally we’ll look at some of the extensibility features in WebAPI to customize and extend Web API in a number of useful ways. Comments:This session was a fill-in for session slots not filled due to MIA speakers stranded by Sandy. I had samples from my previous Web API article so decided to go ahead and put together a session from it. Given that I spent only a couple of hours preparing and putting slides together I was glad it turned out as it did - kind of just ran itself by way of the examples I guess as well as nice audience interactions and questions. Lots of interest - and also some confusion about when Web API makes sense. Both this session and the jQuery session ended up getting a ton of questions about when to use Web API vs. MVC, whether it would make sense to switch to Web API for all AJAX backend work etc. In my opinion there's no need to jump to Web API for existing applications that already have a good AJAX foundation. Web API is awesome for real externally consumed APIs and clearly defined application AJAX APIs. For typical application level AJAX calls, it's still a good idea, but ASP.NET MVC can serve most if not all of that functionality just as well. There's no need to abandon MVC (or even ASP.NET AJAX or third party AJAX backends) just to move to Web API. For new projects Web API probably makes good sense for isolation of AJAX calls, but it really depends on how the application is set up. In some cases sharing business logic between the HTML and AJAX interfaces with a single MVC API can be cleaner than creating two completely separate code paths to serve essentially the same business logic. Resources: Slides and Code Samples Sample Code on GitHub Introduction to ASP.NET Web API White Paper. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in Conferences, ASP.NET

    Read the article

  • Terminal Server Licensing with Citrix?

    - by Data-Base
    We are installing Citrix XenApp 6. Our plan is to have a "citrix control" server to serve as license server and print server, and 2 citrix servers with Microsoft Terminal Sever service installed. Now, my question: the 2 terminal servers are asking for serial and activation, which is OK, but can we install the Terminal Server Licensing service on "citrix control" server so that the 2 terminal servers will use the licenses from the "citrix control" server?

    Read the article

  • Best City in Australia for Dedicated Hosting [closed]

    - by Brian Stinson
    We are looking to duplicate a copy of our database and application servers to serve our customers in Australia. We are looking for a well connected datacenter providing dedicated hosting (full machine rental) to take database updates and the like from our main site in Boston, MA. Which general location/city in Australia is best connected? East Coast? West Coast? If you have individual datacenter recommendations those are helpful as well.

    Read the article
