Search Results

Search found 7128 results on 286 pages for 'httpcontext cache'.

Page 152/286

  • asp.net MVC - how to get complete local and global resources

    - by Buthrakaur
    I'm localizing an application and need to provide a JSON representation of the local and global resources to the JavaScript part of the application for all views. My current idea is to implement HtmlHelper extension methods like GetLocalResourcesJSON/GetGlobalResourcesJSON which would encode all resource keys and values and return them as a JSON string (I'd implement caching as well). At the moment I'm able to retrieve a single specific key from a global resource or from the local resource belonging to the current view (using httpContext.GetGlobalResourceObject/GetLocalResourceObject), but I can't find out how to retrieve the whole resource object and iterate over all of its keys and values. Is there any way to achieve this? It looks like ResourceProviderFactory could be the key to this problem, but it's not publicly accessible anywhere. I could instantiate ResourceExpressionBuilder and use reflection to retrieve the provider via its GetLocal/GlobalResourceProvider() methods, but I don't like using reflection here at all...
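
    A minimal sketch of one possible approach for the global-resource half, assuming strongly typed classes generated from App_GlobalResources (the class name Resources.Labels and the helper name are placeholders): iterate the ResourceSet exposed by the generated ResourceManager and serialize it with JavaScriptSerializer.

        using System.Collections;
        using System.Collections.Generic;
        using System.Globalization;
        using System.Resources;
        using System.Web.Mvc;
        using System.Web.Script.Serialization;

        public static class ResourceJsonExtensions
        {
            // Sketch only: Resources.Labels is an assumed class generated from App_GlobalResources.
            public static string GetGlobalResourcesJSON(this HtmlHelper html)
            {
                var dict = new Dictionary<string, string>();
                ResourceSet set = Resources.Labels.ResourceManager
                    .GetResourceSet(CultureInfo.CurrentUICulture, true, true);
                foreach (DictionaryEntry entry in set)
                {
                    dict[(string)entry.Key] = entry.Value as string;
                }
                return new JavaScriptSerializer().Serialize(dict);
            }
        }

    The local-resource case is harder precisely because of the ResourceProviderFactory issue described above; this sketch only covers resources reachable through a generated ResourceManager.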

    Read the article

  • Logging extra fields with Elmah Error Table?

    - by VJ
    I want to add my own session variable to the Elmah error log table and display it. I have already modified the source code and added the new field to Error.cs and the other required places, but when I assign HttpContext.Current.Session["MyVar"].ToString() to my field in the constructor, it stops logging exceptions altogether. I just need to get the value of the session variable - is there another way to do this? I read a post in which someone added fields for the email notification, but it does not say where exactly I should read the session value.
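
    A likely culprit, offered as a guess: inside Elmah's Error constructor, HttpContext.Current.Session (or the "MyVar" key) can be null, so the .ToString() call throws and the whole logging call dies silently. A defensive sketch, assuming the custom property added to Error.cs is called MyVar:

        // Sketch: guard the session access so a missing session or key
        // cannot throw inside the Error constructor and stop logging.
        string myVar = null;
        var ctx = HttpContext.Current;
        if (ctx != null && ctx.Session != null && ctx.Session["MyVar"] != null)
        {
            myVar = ctx.Session["MyVar"].ToString();
        }
        this.MyVar = myVar ?? string.Empty;  // MyVar = the assumed custom field name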

    Read the article

  • Redirect from https://mydomain.com to http://mydomain.com

    - by Charlie
    Many of my visitors have already bookmarked my site with https://mydomain.com. On the bad advice of a programmer, I put my whole Joomla site under SSL. I do not sell anything or provide any member services. I asked him many times if it would slow my site down - he said it wouldn't. I knew it did; I've researched on this site and realized it does slow the site down because the pages can't be cached. Understood. Please, someone tell me how to get away from it now. I'm not sure how to approach this - should I add something to my .htaccess or my main index.php file? I've looked all over the net; there is plenty of advice on redirecting from http to https, but very few answers about the opposite, going from https to http. Thank you very much for your time. I appreciate it.
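
    For the .htaccess route, a minimal mod_rewrite sketch (assuming mod_rewrite is enabled and these lines sit near the top of the site-root .htaccess, above Joomla's own rules):

        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]

    The 301 tells browsers and search engines to update the bookmarked https URLs permanently. Note that the https side still has to answer on port 443 so that it can issue the redirect.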

    Read the article

  • Deploying web service

    - by baron
    I am trying to build a web service that handles HTTP POST and GET requests. Here is a sample:

        public class CodebookHttpHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                if (context.Request.HttpMethod == "POST")
                {
                    //DoHttpPostLogic();
                }
                else if (context.Request.HttpMethod == "GET")
                {
                    //DoHttpGetLogic();
                }
            }
            ...
            public void DoHttpPostLogic() { ... }
            public void DoHttpGetLogic() { ... }
        }

    I need to deploy this but I am struggling with how to start. Most online references show how to build a website, but really, all I want is for this code to run when an HTTP POST is sent; I don't know what else to put in the website. Some links I've tried so far:
    http://my.execpc.com/~gopalan/dotnet/webservices/webservice_server.html
    http://www.beansoftware.com/asp.net-tutorials/deploy-asp.net.aspx
    http://msdn.microsoft.com/en-us/library/6x71sze4%28VS.80%29.aspx
    http://www.c-sharpcorner.com/UploadFile/rajaduraip/SimplestwaytoCreateNDeployWebServices12232005054219AM/SimplestwaytoCreateNDeployWebServices.aspx
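
    To make IIS route requests to an IHttpHandler you register it in web.config; a sketch, with the path, namespace, and assembly names as placeholders:

        <!-- Classic pipeline (IIS 6 / the Visual Studio dev server): -->
        <system.web>
          <httpHandlers>
            <add verb="GET,POST" path="codebook.ashx"
                 type="MyNamespace.CodebookHttpHandler, MyAssembly" />
          </httpHandlers>
        </system.web>

        <!-- IIS 7+ integrated pipeline: -->
        <system.webServer>
          <handlers>
            <add name="Codebook" verb="GET,POST" path="codebook.ashx"
                 type="MyNamespace.CodebookHttpHandler, MyAssembly" />
          </handlers>
        </system.webServer>

    The "website" can then be little more than this web.config plus the compiled assembly in bin; clients POST to http://yourserver/yourapp/codebook.ashx.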

    Read the article

  • apt changelog for to-be installed packages

    - by ithkuil
    The GUI update-manager is able to show the changelog of packages that are to be installed (not downloaded yet). I also found out how to put the .changelog files in the right place for update-manager to show them, and now I'm happy since I can tell my clients that they can see the changelogs of my custom packages directly from their GUI. Unfortunately I can't find any command-line tool that does the same thing, which would be more useful on servers. From what I saw, this convention (putting .changelog files directly alongside the .deb files in the apt repo) seems to be an Ubuntu-specific extension. There are some Debian resources (the reprepro man page, for example) which point to a different way of storing changelogs online, http://packages.debian.org/changelogs. Does anybody know if there is already a tool like apt-cache that shows the changelogs of packages which are not yet installed (nor downloaded)?
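
    Recent apt versions ship a subcommand for exactly this; offered tentatively, since availability depends on the release and on whether the repository publishes changelogs at all:

        apt-get changelog <package>      # apt 0.8.x and later
        aptitude changelog <package>     # if aptitude is installed

    Both fetch the changelog over the network without downloading or installing the package itself.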

    Read the article

  • apt-get update bzip2 errors

    - by Tejas Kale
    I installed Ubuntu 11.10 today on my Lenovo W500. After that, when I tried running sudo apt-get update, this is the error I get:

        Get:117 http://ftp.jaist.ac.jp oneiric-security/universe TranslationIndex [73 B]
        99% [48 Sources bzip2 0 B] [22 Sources bzip2 5,294 kB]    1,983 kB/s 0s
        bzip2: Compressed file ends unexpectedly;
            perhaps it is corrupted?  *Possible* reason follows.
        bzip2: Inappropriate ioctl for device
            Input file = (stdin), output file = (stdout)
        It is possible that the compressed file(s) have become corrupted.
        You can use the -tvv option to test integrity of such files.
        You can use the `bzip2recover' program to attempt to recover
        data from undamaged sections of corrupted files.

    I found the following similar question: Errors while updating Ubuntu 11.10. I have tried the solutions mentioned there (changing the download server, running apt-get clean and apt-get autoclean) and have also tried removing the /var/cache/apt/archives/lists directory. As a result of this, I am unable to install any new packages.
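
    A sketch of the usual fix for corrupted index files, offered as a guess at what is wrong here: the downloaded package lists actually live in /var/lib/apt/lists (not under /var/cache/apt/archives), so clearing that directory and re-running update forces apt to fetch fresh indexes:

        sudo rm -rf /var/lib/apt/lists/*
        sudo apt-get update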

    Read the article

  • Smart Grid Gateway and New Meter Data Management released

    - by Anthony Shorten
    Two products have just been released and are available from edelivery.oracle.com:
    Smart Grid Gateway 2.0.0 - a new product for integrating with Smart Grid networks
    Meter Data Management 2.0.1 - a new version of the Meter Data Management product
    These are the first products to use the brand new version of the Oracle Utilities Application Framework (V4.1). The new framework builds on FW2.2 and FW4.0.2 to add exciting new features (this is just a subset):
    Support for Database Vault
    Enhancements to Business Object Maintenance
    Batch Statistics Portal for benchmarking
    Custom template user exit support
    File permissions now consistent with other Oracle products
    Use of Universal Connection Pool for all database pool access
    Ability to manage the batch data cache
    Over the next few weeks I will be publishing articles and updates to existing whitepapers to highlight all the new features.

    Read the article

  • Learn About HTML5 and the Future of the Web

    San Francisco Java, PHP, and HTML5 user groups hosted an event on May 11th, 2010 on HTML5 with three amazing speakers: Brad Neuberg from Google, Giorgio Sardo from Microsoft, and Peter Lubbers from Kaazing. In this first of the three videos, Brad Neuberg from Google (formerly an HTML5 advocate and currently a Software Engineer on the Google Buzz team) explains why HTML5 matters - to consumers as well as developers! His overview of HTML5 included SVG/Canvas rendering, CSS transforms, app-cache, local databases, web workers, and much more. He also identified the scope and practical implications of the changes that are coming along with HTML5 support in modern browsers. This event was organized by Marakana, Michael Tougeron from Gamespot, and Bruno Terkaly from Microsoft. Microsoft was the host and Marakana, Gamespot, Medallia, TEKsystems, and Guidewire Software sponsored the event. From: GoogleDevelopers. Time: 50:44.

    Read the article

  • How do I make Powertop changes permanent?

    - by arno
    I'm on a Compaq 615 and its fan is loud as hell. Not much you can do about that, but I'm trying to keep the CPU/GPU as cool as possible. This is what PowerTOP has to say: if I change all of the tunables to "good", the changes don't survive a reboot. Also, upon exiting PowerTOP I get this:

        Loaded 8 prior measurements
        Cannot load from file /var/cache/powertop/saved_parameters.powertop
        Leaving PowerTOP

    I added the line to the grub file as suggested here. Upon closing gedit I get this:

        (gedit:2728): Gtk-WARNING **: Attempting to store changes into `/root/.local/share/recently-used.xbel', but failed: Datei »/root/.local/share/recently-used.xbel.9CIMAW« konnte nicht angelegt werden: Datei oder Verzeichnis nicht gefunden

    The part in German says: couldn't be created: file or directory not found.
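
    One way to make the tunables stick, assuming a PowerTOP 2.x build: apply them automatically at boot, for example from /etc/rc.local (before the final exit 0 line):

        # Re-apply all PowerTOP "good" tunables at every boot
        powertop --auto-tune

    This reapplies the same settings PowerTOP toggles interactively, so nothing is lost on reboot.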

    Read the article

  • SQL ConnectionString in global.asax overridden by web.config

    - by rlb.usa
    This is going to sound very odd, but I have a web.config like this:

        <connectionStrings>
          <remove name="LocalSqlServer"/>
          <add name="LocalSqlServer" connectionString="Data Source=BACKUPDB;..." providerName="System.Data.SqlClient"/>
        </connectionStrings>

    And a global.asax like this:

        void Session_Start(object sender, EventArgs e)
        {
            // Code that runs when a new session is started
            if (Application["con"] == null || Application["con"] == "")
            {
                Application["con"] = "Data Source=PRODUCTIONDB;...";
            }
        }

    And EVERYWHERE in my code, I reference my ConnectionStrings like this:

        SqlConnection con = new SqlConnection(Convert.ToString(HttpContext.Current.Application["con"]));

    However, I see that everything I do inside this application goes to BACKUP db instead of PRODUCTIONDB. What is going on, how could this happen, and why? It doesn't make any sense to me, and it got me into a lot of trouble. We use LocalSqlServer string for FormsAuthentication.
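
    Not an explanation of the root cause, but one way to stop the two definitions drifting apart is to keep web.config as the single source of truth and read it everywhere instead of caching a second copy in Application state; a sketch:

        // Sketch: read the LocalSqlServer entry from web.config directly,
        // so app code and FormsAuthentication always agree on the target database.
        // Requires a reference to System.Configuration.
        string cs = System.Configuration.ConfigurationManager
            .ConnectionStrings["LocalSqlServer"].ConnectionString;
        using (var con = new SqlConnection(cs))
        {
            con.Open();
            // ... run commands ...
        }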

    Read the article

  • Looking for literature about graphics pipeline optimization

    - by zacharmarz
    I am looking for books, articles, or tutorials about graphics architecture and graphics pipeline optimization. They shouldn't be too old (2008 or newer) - the newer, the better. I have found something in [Optimising the Graphics Pipeline, NVIDIA, Koji Ashida] - too old - as well as [Real-Time Rendering, Akenine-Möller], [OpenGL Bindless Extensions, NVIDIA, Jeff Bolz], [Efficient multifragment effects on graphics processing units, Louis Frederic Bavoil], and some internet discussions. But there is not much information there and I want to read more. It should cover communication and data transfer between the application, driver, memory, and shader units; vertices and attributes; and the pre- and post-T&L caches (if they still exist in today's architectures), etc. I don't need anything about textures, frame buffers, or rasterization. It can also be about OpenGL (not DirectX) and optimization extensions (not old extensions like VBOs, but newer ones like vertex_buffer_unified_memory).

    Read the article

  • asp.net root paths

    - by dejavu
    I am getting this exception when trying to save a file:

        System.Web.HttpException: The SaveAs method is configured to require a rooted path, and the path '~/Thumbs/TestDoc2//small/ImageExtractStream.bmp' is not rooted.
           at System.Web.HttpPostedFile.SaveAs(String filename)
           at System.Web.HttpPostedFileWrapper.SaveAs(String filename)
           at PitchPortal.Core.Extensions.ThumbExtensions.SaveSmallThumb(Thumb image) in C:\Users\Bich Vu\Documents\Visual Studio 2008\Projects\PitchPortal\PitchPortal.Core\Extensions\ThumbExenstions.cs:line 23

    The code is below:

        public static void SaveSmallThumb(this Thumb image)
        {
            var logger = Microsoft.Practices.ServiceLocation.ServiceLocator.Current.GetInstance<ILoggingService>();
            string savedFileName = HttpContext.Current.Server.MapPath(Path.Combine(
                image.SmallThumbFolderPath,
                Path.GetFileName(image.PostedFile.FileName)));
            try
            {
                image.PostedFile.SaveAs(savedFileName);
            }
            catch (Exception ex)
            {
                logger.Log(ex.ToString());
            }
        }

    I can't see what is wrong here; any tips?
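
    A guess and a defensive sketch, not a confirmed diagnosis: the double slash in the exception points at an empty segment in image.SmallThumbFolderPath, and the unmapped '~/...' form reaching SaveAs suggests MapPath never ran on the path that actually failed. Mapping the virtual folder on its own and only then appending the file name makes the rooted/unrooted state easier to see and log:

        // Sketch: map the virtual folder first, then append the file name,
        // so SaveAs always receives a rooted physical path.
        string folder = HttpContext.Current.Server.MapPath(image.SmallThumbFolderPath);
        string savedFileName = Path.Combine(folder, Path.GetFileName(image.PostedFile.FileName));
        image.PostedFile.SaveAs(savedFileName);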

    Read the article

  • Custom fine-grained claims based authorization system in ASP.NET MVC - wheres and hows

    - by BuzzBubba
    So, I'd like to implement my own custom authorization system in MVC2. If I have to create a global class, where do I instantiate it? Can HttpContext be extended with my own additions, and where do I do that? Should I use authorization filters for rights validation, or ActionFilters, or do it within an action? Can an ActionFilter pass any data to the action itself? Previously (in WebForms) I used the Session object, where I would put a serialized object containing essential user data (account id and a list of roles and rights), and I'd extend my own Page class.
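
    For the filter questions, a minimal sketch of one way to do it in MVC2 (MyUser and HasRight are assumed types from your own model, not framework classes): derive from AuthorizeAttribute for the rights check, and use HttpContext.Items to hand data from the filter to the action.

        // Namespaces: System.Web, System.Web.Mvc
        public class RequiresRightAttribute : AuthorizeAttribute
        {
            public string Right { get; set; }

            protected override bool AuthorizeCore(HttpContextBase httpContext)
            {
                // Assumed: the current user object was put in session at login.
                var user = httpContext.Session["CurrentUser"] as MyUser;
                return user != null && user.HasRight(Right);
            }

            public override void OnAuthorization(AuthorizationContext filterContext)
            {
                base.OnAuthorization(filterContext);
                // Pass data along to the action via HttpContext.Items.
                filterContext.HttpContext.Items["CurrentUser"] =
                    filterContext.HttpContext.Session["CurrentUser"];
            }
        }

    Usage on an action: [RequiresRight(Right = "EditAccounts")]; inside the action, read HttpContext.Items["CurrentUser"].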

    Read the article

  • Separation of concerns and authentication

    - by Tom Gilder
    I'm trying to be a Good Developer and separate my concerns out. I've got an ASP.NET MVC project with all my web code, and a DAL project with all the model code. Sometimes code in the DAL needs to check whether the current user is authorized to perform some action, by checking something like CurrentUser.IsAdmin. For the web site, the current user is derived from the Windows username (from HttpContext.Current.User.Identity), but this is clearly a web concern and shouldn't be coupled to the DAL. What's the best pattern to loosely couple the authentication? Should the DAL be asking the MVC code for a username, or should the MVC code be telling the DAL? Are there advantages or disadvantages to one or the other? Thank you!
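
    One common shape for this, sketched with illustrative names: the DAL owns a small abstraction and the web project supplies the HttpContext-backed implementation, so System.Web never leaks into the DAL.

        // In the DAL project - no reference to System.Web:
        public interface ICurrentUserProvider
        {
            string UserName { get; }
            bool IsAdmin { get; }
        }

        // In the MVC project:
        public class HttpContextUserProvider : ICurrentUserProvider
        {
            public string UserName
            {
                get { return HttpContext.Current.User.Identity.Name; }
            }

            public bool IsAdmin
            {
                get { return HttpContext.Current.User.IsInRole("Admin"); } // role name assumed
            }
        }

    The DAL takes an ICurrentUserProvider (constructor injection or a factory) and tests can pass a fake; in effect the web layer tells the DAL who the user is, but through an interface the DAL defines.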

    Read the article

  • Does an onboard video affect the X windows configuration?

    - by Timothy
    Does the onboard video on the motherboard affect the X Window configuration? My system has both onboard and PCIe video. The onboard video is an NVIDIA GeForce 7025 GPU (on-board graphics, max. memory share up to 512MB under the OS via TurboCache). I have a PCIe dual-head video card installed with two monitors. The video card is a GeForce 8400 GS with 512MB memory. When installing Ubuntu 12.04, only one monitor worked. When I open System Settings > Displays, it shows a laptop - this is a desktop PC. I did get both monitors to work with the NVIDIA driver using TwinView - a complicated process! When checking the NVIDIA settings now, it shows the monitors disabled, although the NVIDIA X Server Settings tool does show the GPU and all its information. I was thinking it's seeing the onboard video on the motherboard - why else would it show a laptop?

    Read the article

  • Is Dell Inspiron 15R Special Edition compatible with Ubuntu?

    - by Obada Talal Abu Arisheh
    I want to buy a Dell Inspiron 15R Special Edition. On ubuntu.com, it says that the Dell Inspiron 15R will work properly, but the Special Edition is a somewhat different machine. I will list the hardware:
    3rd Generation Intel® Core™ i7-3612QM processor (6M Cache, up to 3.1 GHz)
    15.6" Full High Definition (1080p) LED Display
    8GB Dual Channel DDR3 SDRAM at 1600MHz
    750GB 7200 RPM SATA Hard Drive
    8X Tray Load CD/DVD Burner (Dual Layer DVD+/-R Drive)
    AMD Radeon™ HD 7730M 2GB
    Built-in Skullcandy™ stereo speakers and Waves MaxxAudio® 4 technology
    Will it have any problems?

    Read the article

  • Swap and hibernation

    - by maaartinus
    I saw a lot of recommendations claiming that for hibernation the swap partition/file must be at least as large as the main memory. This makes no sense to me. Let's assume I have 8 GB of main memory and an 8 GB swap area and want to hibernate:
    case 1: I'm using 4 GB of virtual memory - 8 GB of swap is unnecessarily large.
    case 2: I'm using 8 GB of virtual memory - 8 GB of swap is just right.
    case 3: I'm using 12 GB of virtual memory - 8 GB of swap is too small.
    The outcome is: a swap area of size equal to the memory size is sufficient for hibernation IFF it doesn't get used for swapping at all. So what is the reason behind the claim that you need at least as much swap area as main memory for hibernation to work? I know that virtual memory gets used for caching too, and that the cache may simply be discarded, but what happens to hibernation if a program allocates 12 GB of virtual memory (given the above memory and swap sizes)?

    Read the article

  • Share on Facebook does not show thumbnail images

    - by matt_tm
    I have a PHP application with a "Share on Facebook" button that:
    on the development server, shows the thumbnail images correctly and allows the user to select between them;
    on the live server, does NOT show the thumbnail images at all.
    The relevant portion of the .htaccess file is:

        # Set up caching on media files for 2 days
        <FilesMatch "\.(gif|jpg|jpeg|png|flv)$">
          ExpiresDefault A172800
          Header append Cache-Control "public"
        </FilesMatch>

    I'm using the exact same set of PHP files and the same .htaccess, but the server configuration is different. What could be causing this? Note that the text appears fine.
    Edit 1: We are also doing some URL rewriting related to images in the .htaccess (on both servers):

        ...
        RewriteRule ^.*/content/image/(.*)$ content/image/$1 [L]
        ...
        RewriteRule ^.*/images/(.*)$ images/$1 [L]
        ...

    Would that somehow be making a difference? Images appear fine throughout the site. (I posted this question earlier as http://stackoverflow.com/questions/4142597/share-on-facebook-does-not-show-thumbnail-images)

    Read the article

  • How to receive userinfo with google adwords api libraries

    - by PatrickvKleef
    I'm using the Google AdWords API libraries and I would like to receive the userinfo of the logged-in user. I added the userinfo scope as follows:

        googleAdwordsUser = new AdWordsUser();
        string oauth_callback_url = HttpContext.Current.Request.Url.GetLeftPart(UriPartial.Path);
        googleAdwordsUser.OAuthProvider = new AdsOAuthNetProvider(
            "https://adwords-sandbox.google.com/api/adwords/ https://www.googleapis.com/auth/userinfo.email",
            oauth_callback_url,
            Session.SessionID);

    When the callback URL is called, I try to get the user's email address, but it isn't working; the error 'The remote server returned an error: (401) Unauthorized.' is thrown.

        string url = @"https://www.googleapis.com/oauth2/v2/userinfo?access_token=" + token;
        HttpWebRequest objRequest = (HttpWebRequest)WebRequest.Create(url);
        objRequest.Method = "GET";
        HttpWebResponse objResponse = (HttpWebResponse)objRequest.GetResponse();
        string result = string.Empty;
        using (StreamReader sr = new StreamReader(objResponse.GetResponseStream()))
        {
            result = sr.ReadToEnd();
        }

    Does somebody know how to fix this? Thanks.
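
    One thing worth checking, offered as a guess rather than a confirmed fix: the userinfo endpoint expects an OAuth2 access token, and the token handed back by the older AdWords .NET OAuth provider may not be one. If `token` really is an OAuth2 access token, sending it as a Bearer header instead of a query parameter is a quick way to rule out parameter handling:

        // Sketch: same endpoint, token sent in the Authorization header.
        var objRequest = (HttpWebRequest)WebRequest.Create("https://www.googleapis.com/oauth2/v2/userinfo");
        objRequest.Method = "GET";
        objRequest.Headers["Authorization"] = "Bearer " + token;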

    Read the article

  • How do I install the btrfs-restore utility on 12.04?

    - by MountainX
    I would like to install btrfs-restore on Kubuntu 12.04. apt-cache search btrfs-restore returns nothing, and googling "ubuntu download OR install btrfs-restore" returns nothing useful. Also, where do I get btrfs help? I'm not getting any replies on #btrfs on freenode.net. (Correction: I was too impatient. #btrfs was very helpful!) UPDATE: the previously accepted answer no longer works, so I unselected it as the answer. The PPA dmitrij.ledkov/ppa is missing now. Thanks to Pkunk at #btrfs, I posted a new solution below.

    Read the article

  • External USB 3 drive not recognized

    - by ilan123
    Ubuntu 12.10 64 bit seems not to recognize my external hard disk. It is a Vantec NST-310S3 external disk enclosure with a WD 3TB drive. The disk has two NTFS partitions. My PC is a dual-boot system. Under Windows 7 the hard disk works fine, but I can't make it work with Ubuntu. When the drive is connected to the PC, the command sudo fdisk -l seems to hang forever. Below is the output of lsusb and cat /proc/partitions, first without the external drive and then with it connected. I have also added the last lines of dmesg at the end.

    First, without the drive:

        ilan@linux:~$ lsusb
        Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
        Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
        Bus 001 Device 003: ID 13ba:0017 Unknown PS/2 Keyboard+Mouse Adapter
        Bus 001 Device 004: ID 046d:c50e Logitech, Inc. Cordless Mouse Receiver
        Bus 001 Device 005: ID 0ac8:3420 Z-Star Microelectronics Corp. Venus USB2.0 Camera

        ilan@linux:~$ cat /proc/partitions
        major minor  #blocks  name
           8    0  1953514584 sda
           8    1      102400 sda1
           8    2   629043200 sda2
           8    3   367001600 sda3
           8    4           1 sda4
           8    5   471859200 sda5
           8    6   157286400 sda6
           8    7   324115456 sda7
           8    8     4101120 sda8
          11    0     1048575 sr0

    Second, with the USB 3 drive connected:

        ilan@linux:~$ lsusb
        Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
        Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
        Bus 004 Device 002: ID 174c:55aa ASMedia Technology Inc.
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
        Bus 001 Device 003: ID 13ba:0017 Unknown PS/2 Keyboard+Mouse Adapter
        Bus 001 Device 004: ID 046d:c50e Logitech, Inc. Cordless Mouse Receiver
        Bus 001 Device 005: ID 0ac8:3420 Z-Star Microelectronics Corp. Venus USB2.0 Camera

        ilan@linux:~$ cat /proc/partitions
        major minor  #blocks  name
           8    0  1953514584 sda
           8    1      102400 sda1
           8    2   629043200 sda2
           8    3   367001600 sda3
           8    4           1 sda4
           8    5   471859200 sda5
           8    6   157286400 sda6
           8    7   324115456 sda7
           8    8     4101120 sda8
          11    0     1048575 sr0
           8   16  2930266584 sdb

        ilan@linux:~$ lsusb -v -s 004:002
        Bus 004 Device 002: ID 174c:55aa ASMedia Technology Inc.
        Couldn't open device, some information will be missing
        Device Descriptor:
          bLength                18
          bDescriptorType         1
          bcdUSB               3.00
          bDeviceClass            0 (Defined at Interface level)
          bDeviceSubClass         0
          bDeviceProtocol         0
          bMaxPacketSize0         9
          idVendor           0x174c ASMedia Technology Inc.
          idProduct          0x55aa
          bcdDevice            1.00
          iManufacturer           2
          iProduct                3
          iSerial                 1
          bNumConfigurations      1
          Configuration Descriptor:
            bLength                 9
            bDescriptorType         2
            wTotalLength           44
            bNumInterfaces          1
            bConfigurationValue     1
            iConfiguration          0
            bmAttributes         0xc0
              Self Powered
            MaxPower              0mA
            Interface Descriptor:
              bLength                 9
              bDescriptorType         4
              bInterfaceNumber        0
              bAlternateSetting       0
              bNumEndpoints           2
              bInterfaceClass         8 Mass Storage
              bInterfaceSubClass      6 SCSI
              bInterfaceProtocol     80 Bulk-Only
              iInterface              0
              Endpoint Descriptor:
                bLength                 7
                bDescriptorType         5
                bEndpointAddress     0x81  EP 1 IN
                bmAttributes            2
                  Transfer Type            Bulk
                  Synch Type               None
                  Usage Type               Data
                wMaxPacketSize     0x0400  1x 1024 bytes
                bInterval               0
                bMaxBurst              15
              Endpoint Descriptor:
                bLength                 7
                bDescriptorType         5
                bEndpointAddress     0x02  EP 2 OUT
                bmAttributes            2
                  Transfer Type            Bulk
                  Synch Type               None
                  Usage Type               Data
                wMaxPacketSize     0x0400  1x 1024 bytes
                bInterval               0
                bMaxBurst              15

        ilan@linux:~$ sudo fdisk -l
        [sudo] password for ilan:

        Disk /dev/sda: 2000.4 GB, 2000398934016 bytes
        255 heads, 63 sectors/track, 243201 cylinders, total 3907029168 sectors
        Units = sectors of 1 * 512 = 512 bytes
        Sector size (logical/physical): 512 bytes / 512 bytes
        I/O size (minimum/optimal): 512 bytes / 512 bytes
        Disk identifier: 0xf1b4f1ee

           Device Boot      Start         End      Blocks   Id  System
        /dev/sda1   *        2048      206847      102400    7  HPFS/NTFS/exFAT
        /dev/sda2          206848  1258293247   629043200    7  HPFS/NTFS/exFAT
        /dev/sda3      1258293248  1992296447   367001600    7  HPFS/NTFS/exFAT
        /dev/sda4      1992298494  3907028991   957365249    f  W95 Ext'd (LBA)
        /dev/sda5      1992298496  2936016895   471859200    7  HPFS/NTFS/exFAT
        /dev/sda6      2936018944  3250591743   157286400    7  HPFS/NTFS/exFAT
        /dev/sda7      3250593792  3898824703   324115456   83  Linux
        /dev/sda8      3898826752  3907028991     4101120   82  Linux swap / Solaris

    dmesg output after connecting the external drive:

        [ 23.740567] e1000e: eth0 NIC Link is Up 1000 Mbps Full Duplex, Flow Control: Rx/Tx
        [ 23.740786] IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready
        [ 49.144673] usb 4-1: >new SuperSpeed USB device number 2 using xhci_hcd
        [ 49.163039] usb 4-1: >Parent hub missing LPM exit latency info. Power management will be impacted.
        [ 49.166789] usb 4-1: >New USB device found, idVendor=174c, idProduct=55aa
        [ 49.166793] usb 4-1: >New USB device strings: Mfr=2, Product=3, SerialNumber=1
        [ 49.166796] usb 4-1: >Product: AS2105
        [ 49.166799] usb 4-1: >Manufacturer: ASMedia
        [ 49.166801] usb 4-1: >SerialNumber: 0123456789ABCDEF
        [ 49.206372] usbcore: registered new interface driver uas
        [ 49.228891] Initializing USB Mass Storage driver...
        [ 49.229042] scsi6 : usb-storage 4-1:1.0
        [ 49.229115] usbcore: registered new interface driver usb-storage
        [ 49.229116] USB Mass Storage support registered.
        [ 64.045528] scsi 6:0:0:0: >Direct-Access     WDC WD30 EZRX-00MMMB0     80.0 PQ: 0 ANSI: 0
        [ 64.046224] sd 6:0:0:0: >Attached scsi generic sg2 type 0
        [ 64.046881] sd 6:0:0:0: >[sdb] Very big device. Trying to use READ CAPACITY(16).
        [ 64.047610] sd 6:0:0:0: >[sdb] 5860533168 512-byte logical blocks: (3.00 TB/2.72 TiB)
        [ 64.048368] sd 6:0:0:0: >[sdb] Write Protect is off
        [ 64.048373] sd 6:0:0:0: >[sdb] Mode Sense: 23 00 00 00
        [ 64.048984] sd 6:0:0:0: >[sdb] No Caching mode page present
        [ 64.048987] sd 6:0:0:0: >[sdb] Assuming drive cache: write through
        [ 64.049297] sd 6:0:0:0: >[sdb] Very big device. Trying to use READ CAPACITY(16).
        [ 64.050942] sd 6:0:0:0: >[sdb] No Caching mode page present
        [ 64.050944] sd 6:0:0:0: >[sdb] Assuming drive cache: write through
        [ 94.245006] usb 4-1: >reset SuperSpeed USB device number 2 using xhci_hcd
        [ 94.262553] usb 4-1: >Parent hub missing LPM exit latency info. Power management will be impacted.
        [ 94.263805] xhci_hcd 0000:03:00.0: >xHCI xhci_drop_endpoint called with disabled ep ffff8800d37d1c00
        [ 94.263808] xhci_hcd 0000:03:00.0: >xHCI xhci_drop_endpoint called with disabled ep ffff8800d37d1c40
        [ 125.262722] usb 4-1: >reset SuperSpeed USB device number 2 using xhci_hcd
        [ 125.280304] usb 4-1: >Parent hub missing LPM exit latency info. Power management will be impacted.
        [ 125.281511] xhci_hcd 0000:03:00.0: >xHCI xhci_drop_endpoint called with disabled ep ffff8800d37d1c00
        [ 125.281516] xhci_hcd 0000:03:00.0: >xHCI xhci_drop_endpoint called with disabled ep ffff8800d37d1c40

    Read the article

  • Getting a scriptmanager into a dynamically rendered page

    - by AndreasKnudsen
    Hi, we are rendering user controls dynamically like this:

        public string RenderControl(string path)
        {
            string html;
            var page = new Page();
            var control = page.LoadControl(path);
            page.Controls.Add(control);
            // do stuff to the control (give it some data to work on)
            using (var writer = new StringWriter())
            {
                HttpContext.Current.Server.Execute(page, writer, false);
                html = writer.ToString();
            }
            return html;
        }

    This lets us use the same user controls when rendering pages normally as when rendering responses to AJAX calls. However, when adding controls which themselves contain a ScriptManagerProxy, we run into the problem that the newed-up Page object contains neither a ScriptManager nor the HtmlForm in which the ScriptManager needs to run. Is there any way around this? Yours, Andreas
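
    A sketch of the usual workaround, hedged because the throw-away page still skips parts of the normal lifecycle: give the page an HtmlForm and a real ScriptManager before loading the control, so the ScriptManagerProxy has something to attach to.

        public string RenderControl(string path)
        {
            string html;
            var page = new Page();
            var form = new HtmlForm();               // System.Web.UI.HtmlControls
            page.Controls.Add(form);
            form.Controls.Add(new ScriptManager());  // must be added before the control
            var control = page.LoadControl(path);
            form.Controls.Add(control);
            using (var writer = new StringWriter())
            {
                HttpContext.Current.Server.Execute(page, writer, false);
                html = writer.ToString();
            }
            return html;
        }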

    Read the article

  • Create speed baseline for local web file

    - by Michael Jasper
    Is there any tool or method that will load a localhost page a number of times and return averaged data for load times, onload events, DOM-ready events, etc.? I'd like to work on page speed optimization, but I need a baseline before I begin. I have used both Google Analytics and Webmaster Tools, but I'd like an automated solution that runs locally. My ideal solution would be a program or script that takes the path/file and a number of iterations, then spends a few minutes loading the page n times without cache and crunches the numbers to create a baseline.
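
    Not a full answer - it measures only network/transfer time, not onload or DOM-ready events - but as a rough local baseline, a loop over curl's timing output is easy to run (the URL and iteration count are placeholders):

        # Fetch the page 20 times, bypassing caches, and print total time per run
        for i in $(seq 1 20); do
          curl -s -o /dev/null -H "Cache-Control: no-cache" \
               -w "%{time_total}\n" "http://localhost/mypage.html"
        done

    Averaging the printed values gives a repeatable number to compare against after each optimization.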

    Read the article

  • How to remove old robots.txt from google as old file block the whole site

    - by KnowledgeSeeker
    I have a website for which Google Webmaster Tools still shows the old robots.txt:

        User-agent: *
        Disallow: /

    which is blocking Googlebot. I removed the old file and uploaded a new robots.txt with almost full access yesterday, but it is still showing me the old version of robots.txt. The contents of the latest copy are below:

        User-agent: *
        Disallow: /flipbook/
        Disallow: /SliderImage/
        Disallow: /UserControls/
        Disallow: /Scripts/
        Disallow: /PDF/
        Disallow: /dropdown/

    I submitted a request to remove this file using Google Webmaster Tools, but my request was denied. I would appreciate it if someone could tell me how I can clear it from the Google cache and make Google read the latest version of the robots.txt file.

    Read the article
