Search Results

Search found 42685 results on 1708 pages for 'page speed'.

  • jQuery global variable best practice & options?

    - by Kris Krause
    Currently I am working on a legacy web page that uses a ton of JavaScript, jQuery, Microsoft client JavaScript, and other libraries. The bottom line - I cannot rewrite the entire page from scratch as the business cannot justify it. So... it is what it is. Anyway, I need to pollute (I really tried not to) the global namespace with a variable. These are the three options I was considering:
    1. Just store/retrieve it with a normal JavaScript declaration - var x = 0;
    2. Use jQuery to store/retrieve the value on a DOM element - $("body").data("x", 0);
    3. Use a hidden form field, and set/retrieve the value with jQuery - $("whatever").data("x", 0);
    What does everyone think? Is there a better way? I looked at the existing pile of code and I do not believe the variable can be scoped inside a function.
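
    A related option the question hints at but does not spell out is a single namespace object; here is a minimal, hedged sketch (the object name is illustrative, not from the original page):

        // One shared global object instead of many loose globals
        var MyLegacyPage = window.MyLegacyPage || {};
        MyLegacyPage.x = 0;                       // read/write as a plain property

        // Or keep the value off the global scope entirely via jQuery's data API
        jQuery(function ($) {
            $("body").data("x", 0);               // store
            var x = $("body").data("x");          // retrieve later
        });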

    Read the article

  • Computer POSTs and draws the BIOS screens very slowly - motherboard issue?

    - by Ssvarc
    I have a desktop that refuses to boot into Windows. I used Hitachi DFT and the HD came back OK. I then used Memtest86+ and it took hours for the test to run. After 8+ hours it was only up to test number 6. I aborted and ran Memtest86. It ran at basically the same speed. I aborted and went to look at the BIOS settings. The computer is running slowly at POST. It takes a long time for the keyboard to be recognized, etc. The BIOS settings screens take time to be (slowly) drawn on the screen. What could be causing such behavior? EDIT: I gave back the computer a while back without ever discovering the cause, so I'm closing the question.

    Read the article

  • Please help properly setting up path variables for root.php

    - by Joel
    Hi guys, I just posted a similar question, but deleted it because I realized I was working with an old file... doh! I am just trying to get my XAMPP setup working for me. I have a live site that navigates to a login page at http://www.monkeycalendar.com/arvindkt/login.php. That login page includes a root.php file found at http://www.monkeycalendar.com/arvindkt/root.php. The live site works great. My localhost is set up so each site is a folder under localhost, i.e. http://www.example.com = localhost/example.com. I'm having problems figuring out how to make my root folder point to the right directory. Any help would be much appreciated. root.php:
        # local settings
        define("SITE_ROOT" , $_SERVER['DOCUMENT_ROOT']."/arvindkt");
        define("SITE_URL" , "http://localhost/monkeycalendar.com");
        define('DB_HOST', "localhost");
        define('DB_USER', "root");
        define('DB_PASS', "");
        define('DB_NAME', "dev.monkeycalendar");
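
    A hedged sketch of one way to make root.php environment-aware, switching on the host name so the same file works locally and live (the local folder names are guesses based on the layout described above):

        <?php
        // Illustrative only - adjust the folder names to match the real local layout.
        if ($_SERVER['HTTP_HOST'] === 'localhost') {
            define("SITE_ROOT", $_SERVER['DOCUMENT_ROOT'] . "/monkeycalendar.com/arvindkt");
            define("SITE_URL",  "http://localhost/monkeycalendar.com/arvindkt");
        } else {
            define("SITE_ROOT", $_SERVER['DOCUMENT_ROOT'] . "/arvindkt");
            define("SITE_URL",  "http://www.monkeycalendar.com/arvindkt");
        }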

    Read the article

  • Connect two subnets without router

    - by Shcheklein
    I got two Comcast routers, each with a different subnet. Every subnet contains 5 static IPs. Two questions: 1) Are there any problems if both routers and the machines from both subnets are connected to one switch? Security is not a concern here; I need to know if there are performance or other problems. 2) Is it possible to make machines from different subnets see each other if they are all connected to one switch? Some static routing, added ARP records, or something else... I just want to avoid configuring second ethernet adapters, a third router, or the like. And I need to connect these subnets via a high-speed local network.
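
    A hedged sketch of the static-routing idea: on a host in one subnet, add a route for the other subnet via that subnet's router (the addresses below are placeholders; Comcast typically hands out /29 blocks for 5 usable static IPs):

        # Linux host in subnet A, pointing traffic for subnet B at B's gateway:
        ip route add 203.0.113.0/29 via 198.51.100.1

        # Windows equivalent (persistent route):
        route -p ADD 203.0.113.0 MASK 255.255.255.248 198.51.100.1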

    Read the article

  • Strange TFS Source Repository Problem

    - by Brian
    We have a web project we are working on using TFS and we are kind of new to it (TFS). One of my teammates is unable to see a particular page (three associated files) in the IDE. To the rest of us, it looks as though it is checked out to her. When she ran the unlock command through the console, it returned that the files for the page were not locked. Yet we are unable to check it out due to her apparently having a lock. Any thoughts, ideas, or suggestions would be greatly appreciated!
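
    For reference, a hedged sketch of the commands usually tried in this situation from a Visual Studio command prompt (the workspace name, user, and server path are placeholders):

        rem Release any lock held on the item:
        tf lock "$/Project/Web/ProblemPage.aspx" /lock:none /recursive

        rem Undo the other user's pending change (requires appropriate permissions):
        tf undo /workspace:HerWorkspace;DOMAIN\heruser "$/Project/Web/ProblemPage.aspx" /recursive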

    Read the article

  • Which is better for running Ubuntu and other Linux OSes, Chromebook or Windows, why? [on hold]

    - by Serge
    I'm learning programming and I would like to switch to a Linux OS, perhaps Ubuntu, to continue this. The current machine is getting pretty old and slow, Windows is my least favorite option for production, and I can manage to get something new right around the price range of the nicest Chromebook on the market right now. However, I have compared the specs of the HP Chromebook 14 with those of regular PC laptops that cost roughly the same, and the latter consistently have approximately the same and sometimes higher specs (like processor speed). Yet using Chromebooks for this purpose is pretty widespread nowadays. Is this because they were initially built for a Linux OS - and is it really THAT crucial - or are there other major factors at play here?

    Read the article

  • ServerAlias www.example.com is not recognized

    - by Tianzhou Chen
    Below is my config file:
        NameVirtualHost 12.34.56.78:80
        <VirtualHost 12.34.56.78:80>
            ServerAdmin [email protected]
            ServerName domain1.com
            ServerAlias www.domain1.com
            DocumentRoot /srv/www/domain1.com/public_html1/
            ErrorLog /srv/www/domain1.com/logs/error.log
            CustomLog /srv/www/domain1.com/logs/access.log combined
        </VirtualHost>
        <VirtualHost 12.34.56.78:80>
            ServerAdmin [email protected]
            ServerName domain2.com
            ServerAlias www.domain2.com
            DocumentRoot /srv/www/domain2.com/public_html1/
            ErrorLog /srv/www/domain2.com/logs/error.log
            CustomLog /srv/www/domain2.com/logs/access.log combined
        </VirtualHost>
    The thing is, when I put www.domain1.com into the browser, apache2 doesn't retrieve the web page that resides in /srv/www/domain1.com/public_html1/; instead, it gets the page from the default document root defined in another file. However, if I put in www.domain2.com, everything works fine. I don't see any difference between the two VirtualHost config blocks, so I wonder what makes the difference. BTW, I haven't put any .htaccess file under their document roots. Thanks for your advice! Tianzhou
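
    A quick, hedged way to see which name-based hosts Apache has actually registered (and in what order) is to dump the parsed virtual host configuration; the control command name depends on the distribution:

        # Debian/Ubuntu style:
        apache2ctl -S
        # Other layouts:
        apachectl -S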

    Read the article

  • Scripts on UNC paths take very long to run

    - by Álvaro G. Vicario
    I have several scripts in UNC paths (from Windows batch files to PHP scripts). No matter how I run them (double-click in Explorer, my editor's run command menu, or the Windows command prompt), they take a really long time to start running (like 14 seconds). Once they get started they run normally. This doesn't happen if I run them from mapped drives. I'm using Windows XP Professional SP3 inside an Active Directory domain, and the files are hosted on a Windows Server box (not sure about the version; it's an HP dedicated file server with bundled OS). Why does this happen? Is there a way to speed things up while using UNC paths?

    Read the article

  • Is it worthwhile to implement observer pattern in PHP?

    - by Extrakun
    I have been meaning to make use of design patterns in PHP, such as the observer pattern, but it pains me that I have to recreate the observer relationships each time the page is loaded. As references are saved as new concrete objects in the session, there is no way to preserve relationships between subscribers and their observers unless you use a GUID or some other property to form a lookup, and store that property instead. With the cost of recreating the relationships each time a page is loaded, is it worthwhile to use design patterns such as observers in PHP for the sake of a clean design? Any real-world experience to share?
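
    For reference, a minimal sketch of the pattern using PHP's built-in SPL interfaces (class names are illustrative); the relationships here live only for the lifetime of one request, which is exactly the cost the question is weighing:

        <?php
        class Feed implements SplSubject {
            private $observers = array();
            public function attach(SplObserver $o) { $this->observers[spl_object_hash($o)] = $o; }
            public function detach(SplObserver $o) { unset($this->observers[spl_object_hash($o)]); }
            public function notify() { foreach ($this->observers as $o) { $o->update($this); } }
        }
        class Logger implements SplObserver {
            public function update(SplSubject $subject) { /* react to the change */ }
        }
        $feed = new Feed();
        $feed->attach(new Logger());   // must be re-attached on every page load
        $feed->notify();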

    Read the article

  • Windows 7 Machine Makes Router Drop -All- Wireless Connections [closed]

    - by Hammer Bro.
    Note: I accidentally originally posted this question over at SuperUser, and I still think the issue is caused by some low-level networking practice of Windows 7, but I think the expertise here would be more apt to figuring it out. Apologies for the cross-post.
    Some background: My home network consists of my Desktop, a two-month old Windows 7 (x64) machine which is online most frequently (N-spec), as well as three other Windows XP laptops (all G) that only connect every now and then (one for work, one for Netflix, and the other for infrequent regular laptop uses). I used to have a Belkin F5D8236-4 wireless router, and everything worked great. A week ago, however, I found out that the Belkin absolutely in no way would establish a VPN connection, something that has become important for work. So I bought a Netgear WNR3500v2/U/L. The wireless was acting a little sketchy at first for just the Windows 7 machine, but I thought it had something to do with 802.11N and I was in a hurry so I just fished up an ethernet cable and disabled the computer's wireless. It has now become apparent, though, that whenever the Windows 7 machine is connected to the router, all wireless connections become unstable. I was using my work laptop for a solid six hours today with no trouble, having multiple SSH connections open over VPN and streaming internet radio in the background. Then, within two minutes of turning on this Windows 7 box, I had lost all connectivity over the wireless. And I was two feet away from the router. The same sort of thing happens on all of the other laptops -- Netflix can be playing stuff all weekend, but if I come up here and do things on this (W7) computer, the streaming will be dead within ten minutes.
    So here are my basic observations:
    - If the Windows 7 machine is off, then all connections will have a Signal Strength of Very Good or Excellent and a Speed of 48-54 Mbps for an indefinite amount of time.
    - Shortly after the Windows 7 machine is turned on, all wireless connections will experience a consistent decline in Speed down to 1.0 Mbps, eventually losing their connection entirely. These machines will continue to maintain 70% signal strength, as observed by themselves and router.
    - Once dropped, a wireless connection will have difficulty reconnecting. And, if a connection manages to become established, it will quickly drop off again.
    - The Windows 7 machine itself will continue to function just fine if it's using a wired connection, although it will experience these same issues over the wireless.
    - All of the drivers and firmwares are up to date, and this happened both with the stock Netgear firmware as well as the (current) DD-WRT.
    What I've tried:
    - Making sure each computer is being assigned a distinct IP. (They are.)
    - Disabling UPnP and Stateful Packet Inspection on the router.
    - Disabling Network Sharing, SSDP Discovery, TCP/IP NetBios Helper and Computer Browser services on the Windows 7 machine.
    - Disabling QoS Packet Scheduler, IPv6, and Link Layer Topology Discovery options on my ethernet controller (leaving only Client for Microsoft Networks, File and Printer Sharing, and IPv4 enabled).
    What I think: It seems awfully similar to the problems discussed in detail at http://social.msdn.microsoft.com/Forums/en/wsk/thread/1064e397-9d9b-4ae2-bc8e-c8798e591915 (which was both the most relevant and concrete information I could dig up on the internet). I still think that something the Windows 7 IP stack (or just Operating System itself) is doing is giving the router fits.
    However, I could be wrong, because I have two key differences. One is that most instances of this problem are reported as the entire router dying or restarting, and mine still works just fine over the wired connection. The other is that it's a new router, tested with both the factory firmware and the (I assume) well-maintained DD-WRT project. Even if Windows 7 is still secretly sending IPv6 packets or the TCP Window Scaling implementation that I hear Vista caused some trouble with (even though I've tried my best to disable anything fancy), this router should support those functions. I don't want to get a new or a replacement router unless someone can convince me that this is a defective unit. But the problem seems too specific and predictable by my instincts to be a hardware hiccup. And I don't want to deal with the inevitable problems that always seem to take half a day to resolve when getting a new router, since I'm frantically working (including tomorrow) to complete a project by next week's deadline. Plus, I think in the worst case scenario, I could keep this router connected directly to the modem, disable its wireless entirely, and connect the old Belkin to it directly. That should allow me to still use VPN (although I'll have to plug my work laptop directly into that router), and then maintain wireless connections for all of the other computers. But that feels so wrong to me. Anyone have any ideas what the cause and possible solution could be?
    Clarifications: The Windows 7 machine is directly connected via an ethernet cable to the router for everything above. But while it is online, all other computers' wireless connections become unusable. It is not an issue of signal strength or interference -- no other devices within scanning range are using Channel 1, and the problem will affect computers that are literally feet away from the router with 95% signal strength.

    Read the article

  • AD: stopping a script and writing a value to a user's AD account after a PPT presentation

    - by Steven Maxon
    'This will launch the PPT in a GPO:
        Dim ppt
        Set ppt = CreateObject("PowerPoint.Application")
        ppt.Visible = True
        ppt.Presentations.Open "C:\Scripts\Test.pptx"
    'This is the batch file at the end of the PPT that records the date, time, computer name and username:
        echo "Logon Date:%date%,Logon Time:%time%,Computer Name:%computername%,User Name:%username%" >> \\servertest\g$\Tracking\LOGON.TXT
    'This is what I need but can't find: I need the script to check a value in the Web page: attribute of the Active Directory user's account and shut off the script if the user has already completed reading the presentation. It could be as simple as writing XXXX. I need the value XXXX written to the Web page: attribute of the Active Directory user's account when they finish reading the presentation (after they click on the bat file), so the script will not run again when they log in.
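
    A hedged sketch of the check-and-write step; it assumes the "Web page" field in ADUC maps to the wWWHomePage attribute and that the logged-on user is allowed to update it (the marker value XXXX comes from the question):

        ' Illustrative only - verify the attribute name and permissions in your domain.
        Dim objSysInfo, objUser, currentValue
        Set objSysInfo = CreateObject("ADSystemInfo")
        Set objUser = GetObject("LDAP://" & objSysInfo.UserName)

        On Error Resume Next
        currentValue = objUser.Get("wWWHomePage")   ' raises an error if the attribute is empty
        On Error GoTo 0

        If currentValue = "XXXX" Then
            WScript.Quit                            ' already completed - do not launch the PPT
        End If

        ' ... launch the presentation here, then record completion:
        objUser.Put "wWWHomePage", "XXXX"
        objUser.SetInfo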

    Read the article

  • mod_cache serving the wrong content

    - by J. Pablo Fernández
    I'm trying to use mod_disk_cache to speed up a web site that is running on WordPress. Whenever I enable it with CacheEnable disk / (with the rest being the stock Ubuntu configuration), I start to get the wrong results. When I view the main page it's fine, but when I go to a specific post I get an RSS feed instead. It's as if the cache is returning the wrong content. I've disabled my RewriteRules, as it seems mod_cache doesn't work with them. I'm not even sure where to start to debug such a thing. Any ideas?
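
    A hedged starting point for debugging: carve the suspect URLs out of the cache and turn up logging so the cache decisions show in the error log (these are standard mod_cache directives; the /feed path is a guess at the WordPress feed location):

        CacheEnable disk /
        # Bypass the cache for feeds while investigating:
        CacheDisable /feed
        # Raise the log level so mod_cache logs hits, misses, and the keys it stores:
        LogLevel debug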

    Read the article

  • How to display an image from the DB in a DataGrid?

    - by Saverio Tedeschi
    I hope this hasn't been answered yet, but I've googled for quite a long time and haven't been able to get it working. In brief, I have a SL4 page with a DataGrid filled according to parameters passed from the previous page, so I fill the context in code with a "RIA" query (with parameters as well). So far, so good. I get the XAML-declared columns as expected, except for the Photo templated column, which holds just an image control. I think I can use the LoadingRow event, but can someone give me some code showing how to reach my goal (the image is in the context from the DB)? Thanks in advance
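
    One hedged alternative to handling LoadingRow is a value converter on the Image binding in the template column; this sketch assumes the photo arrives as a byte[] property (here called Photo), which is typical for RIA entities but not confirmed by the question:

        // Illustrative Silverlight converter: byte[] from the DB -> ImageSource for the grid.
        public class BytesToImageConverter : System.Windows.Data.IValueConverter
        {
            public object Convert(object value, System.Type targetType, object parameter,
                                  System.Globalization.CultureInfo culture)
            {
                var bytes = value as byte[];
                if (bytes == null) return null;
                var image = new System.Windows.Media.Imaging.BitmapImage();
                image.SetSource(new System.IO.MemoryStream(bytes));   // decode in memory
                return image;
            }

            public object ConvertBack(object value, System.Type targetType, object parameter,
                                      System.Globalization.CultureInfo culture)
            {
                throw new System.NotImplementedException();
            }
        }

        <!-- In the DataGridTemplateColumn, with the converter declared as a StaticResource: -->
        <Image Source="{Binding Photo, Converter={StaticResource BytesToImage}}" Height="48" />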

    Read the article

  • Why does my 64-bit IIS app pool show 3 gigabytes more virtual memory than private memory?

    - by Brett
    I have an ASP.Net application that I am running on 64-bit IIS 6 on Windows XP x64. When I open performance counters after one page hit of a trivial page, I see a Private Bytes of about 88 megs, but a Virtual Bytes of about 3 Gigs. When I try the same thing with a VERY trivial ASP.Net app, I get the same result. We see something similar on Windows Server 2003 in production -- there it is an issue because we recycle when the virtual memory consumed outgrows a limit. Before we make any changes to our recycling settings, we'd like to answer the following questions: Why does the app pool grab such a large hunk of virtual memory? Is the amount of virtual memory headroom the app requests configurable? Thanks! Brett

    Read the article

  • ASP.NET MVC 2 - Account controller not found

    - by Chris
    Hi all, I've recently created an ASP.NET MVC 2 application, which works perfectly in the development environment. However, when I deploy it to the server (123-reg Premium Hosting), I can access all of the expected areas - except the Account controller (www.host.info/Account). This then attempts to redirect to the Error.aspx page (www.host.info/Shared/Error.aspx) which it cannot find. I've checked that all of the views have been published, and they're all in the correct place. It seems bizarre that two other controllers can be accessed with no problems, whereas the Account controller cannot be found. I have since renamed the AccountController to SecureController, and all of the dependencies, to no avail. The problem with not being able to find the Error.aspx page also occurs on the development environment. Any ideas would be greatly appreciated. Thanks, Chris

    Read the article

  • Colorbox class doesn't work with ajax/DOM objects

    - by almal
    I usually use the Colorbox tool to open "popup windows" in my pages and it all works fine. In my new project the situation is a little different because I use js/ajax/DOM to create my objects dynamically in the handleRequestStateChange() function. After importing the js, jQuery and css for Colorbox, in the head of my js page I write:
        $(document).ready(function () {
            $(window).scroll(function () {
                //oP1 = document.createTextNode(posizione_menu.offsetTop);
                //divIpt.appendChild(oMtx1);
                $(".divHcss").css("position", "fixed").css("top", "0px").css("z-index", "999");
            });
            //Examples of how to assign the Colorbox event to elements
            $(".group1").colorbox({rel:'group1'});
            $(".group2").colorbox({rel:'group2', transition:"fade"});
            $(".group3").colorbox({rel:'group3', transition:"none", width:"75%", height:"75%"});
            $(".group4").colorbox({rel:'group4', slideshow:true});
            $(".ajax").colorbox();
            $(".youtube").colorbox({iframe:true, innerWidth:640, innerHeight:390});
            $(".vimeo").colorbox({iframe:true, innerWidth:500, innerHeight:409});
            $(".iframe").colorbox({iframe:true, width:"80%", height:"80%"});
            $(".inline").colorbox({inline:true, width:"50%"});
            $(".callbacks").colorbox({
                onOpen:function(){ alert('onOpen: colorbox is about to open'); },
                onLoad:function(){ alert('onLoad: colorbox has started to load the targeted content'); },
                onComplete:function(){ alert('onComplete: colorbox has displayed the loaded content'); },
                onCleanup:function(){ alert('onCleanup: colorbox has begun the close process'); },
                onClosed:function(){ alert('onClosed: colorbox has completely closed'); }
            });
            $('.non-retina').colorbox({rel:'group5', transition:'none'})
            $('.retina').colorbox({rel:'group5', transition:'none', retinaImage:true, retinaUrl:true});
            //Example of preserving a JavaScript event for inline calls.
            $("#click").click(function(){
                $('#click').css({"background-color":"#f00", "color":"#fff", "cursor":"inherit"}).text("Open this window again and this message will still be here.");
                return false;
            });
        });
    and afterwards, in handleRequestStateChange(), I create my a element and assign it to a div:
        var a = createElement('a');
        //a.style.display = "block";
        a.setAttribute('class','iframe');
        a.setAttribute('href',"php/whois.php?P1="+oStxt.value);
        var divIp3 = createElement('div', 'divIp3', 'divIp3css');
        var divIp31 = createElement('div', 'divIp31', 'divIp31css');
        divIp3.appendChild(divIp31);
        divIp3.appendChild(a);
        a.appendChild(divIp31);
    The divIp31 becomes clickable, but the href opens the page in a normal browser tab instead of using the class attribute for Colorbox. Does anyone have an idea about this? Thanks in advance, AM
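
    A hedged note on what usually causes this: the .colorbox() calls in $(document).ready() only bind to elements that exist at that moment, so links created later in handleRequestStateChange() never get the handler. One minimal sketch is to bind the new anchor right after it is created, with the same options as the existing ".iframe" setup:

        // After appending the new <a> to the document, attach Colorbox to it directly:
        $(a).colorbox({iframe: true, width: "80%", height: "80%"});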

    Read the article

  • "tar -cfz" versus "tar cf - | gzip": are they different? (or how to improve a backup)

    - by I'm Dario
    I want to speed up my backup done with "tar -cfz", the common way to do it. But day by day my backed-up files grow, so it becomes slower. I was thinking of taking advantage of the several cores available in my server, and I was wondering if there is any difference between doing the backup with "tar -cfz" or piping tar to gzip ("tar cf - | gzip"). I guess that there isn't any difference, because the first spawns two processes (tar and gzip), much like piping does. If there is no difference, do you know any good alternative to do this, without going incremental? I'm looking at pigz too and it looks fine.
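
    A hedged sketch of the multi-core route with pigz (mentioned at the end of the question); file names and paths are placeholders:

        # Pipe tar through pigz, which compresses with all available cores:
        tar cf - /path/to/data | pigz > backup.tar.gz

        # GNU tar can also be told to use pigz as its compressor directly:
        tar --use-compress-program=pigz -cf backup.tar.gz /path/to/data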

    Read the article

  • Unable to log in to Magento administration

    - by SIA
    Hi everybody, I have just installed Magento on Windows using WAMP. Installation was successful, without any errors or warnings. When I browse to the administration page, I can see the login screen. After entering the correct credentials, it does not display the Dashboard/Control Panel; it displays the same login page again. If I enter wrong credentials, it validates them and displays an "invalid username or password" message. I am unable to determine the issue. If anybody has been through this issue and solved it, please help me. How can I log in to the Magento administration? Note: during installation I selected session = Database. Could that be an issue? Kindly advise, SIA

    Read the article

  • Dedicated GPU in Dell PowerEdge C1100

    - by Eli Gundry
    We recently purchased a Dell PowerEdge C1100 off lease with the intention of using it for graphics processing. We installed an AMD HD 7000 series GPU in it that runs off of board power, and it sends video to the display. That said, the video is very choppy, leading us to believe that the onboard video is doing all the processing and sending it to the card. Is there any way to either disable the VGA on this server or tell the OS to use only the dedicated card? More info: the server is running RHEL 6.4; the graphics card is running the proprietary AMD drivers; the video card only works in the OS and does not show the BIOS on boot (we know it's impossible to change this). Any ideas, guys? Update: We are now thinking that the GPU is doing the graphics processing, but not working at the full speed of the PCI bus. Which is odd, because it is an x16 slot, but it is probably optimized for a RAID card (if that makes any sense). Is there any way to remedy the choppy graphics on this server?
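
    A hedged way to check the "not at full PCI bus speed" theory from within RHEL is to compare the card's advertised and negotiated PCIe link (the grep pattern is a guess at how the card is listed):

        # Find the display controllers and show their capabilities, then compare
        # the LnkCap (advertised) and LnkSta (negotiated) width/speed lines:
        lspci -vv | grep -i -A 40 "VGA compatible controller"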

    Read the article

  • TinyMce + Ajax File Manager + Codeigniter = Little Problem

    - by lucha libre
    OK, I'm using the following: TinyMCE, CodeIgniter, and the TinyMCE Ajax File Manager. I can upload correctly and it looks pretty good. However, when I view the HTML (from TinyMCE), this is what I get:
        <img src="../../../data/page/verde_enfemera.jpg" alt="" />
    What I need to be getting is the following:
        <img src="http://localhost/http/data/page/verde_enfemera.jpg" alt="" />
    Can someone help? EDIT: I changed the code in the HTML editor of TinyMCE, then I saved it. When I re-opened it, the code had reverted back to the original "../../../data", etc. Please help, someone.
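
    For what it's worth, TinyMCE rewrites URLs according to its own settings, so one hedged thing to try is forcing absolute URLs in the init call (these are standard TinyMCE options; the base URL below is an assumption taken from the desired output above):

        tinyMCE.init({
            relative_urls: false,
            remove_script_host: false,
            convert_urls: true,
            document_base_url: "http://localhost/http/"
            // ...existing init options unchanged...
        });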

    Read the article

  • Run a PHP-script from a PHP-script without blocking

    - by Nicklas Ansman
    I'm building a spider which will traverse various sites and mine them for data. Since I need to get each page separately, this could take a VERY long time (maybe 100 pages). I've already set set_time_limit to 2 minutes per page, but it seems like Apache will kill the script after 5 minutes no matter what. This isn't usually a problem since this will run from cron or something similar which does not have this time limit. However, I would also like the admins to be able to start a fetch manually via an HTTP interface. It is not important that Apache is kept alive for the full duration; I'm going to use AJAX to trigger a fetch and check back once in a while with AJAX. My problem is how to start the fetch from within a PHP script without the fetch being terminated when the script calling it dies. Maybe I could use system('script.php &'), but I'm not sure it will do the trick. Any other ideas?
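
    A hedged sketch of the usual way to launch a long-running child from a web request without the web script waiting on it (the script path is a placeholder); redirecting output and backgrounding are what let PHP return immediately:

        <?php
        // Fire and forget: the child keeps running after this request finishes.
        exec('php /path/to/spider.php > /dev/null 2>&1 &');

        // An alternative is to only write a "job requested" flag here and let the
        // existing cron entry pick it up on its next run.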

    Read the article

  • Search Route in ASP.NET MVC

    - by olst
    Hi. I have a simple search form in my master page and a search controller and view. I'm trying to get the following route for the string search term "myterm" (for example): root/search/myterm
    The form in the master page:
        <% using (Html.BeginForm("SearchResults", "Search", FormMethod.Post, new { id = "search_form" })) { %>
            <input name="searchTerm" type="text" class="textfield" />
            <input name="search" type="submit" value="search" class="button" />
        <%} %>
    The controller action:
        public ActionResult SearchResults(string searchTerm){...}
    The route I'm using:
        routes.MapRoute(
            "Search",
            "search/{term}",
            new { controller = "Search", action = "SearchResults", term = (string)null }
        );
        routes.MapRoute(
            "Default",
            "{controller}/{action}",
            new { controller = "Home", action = "Index" }
        );
    I'm always getting the URL "root/search" without the search term, no matter what search term I enter. Thanks.
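
    A hedged observation and sketch: a POST never puts the field value into the URL, and the route token ({term}) does not match the form field or action parameter (searchTerm). One way to get root/search/myterm is to align the names and redirect after the POST; the names below mirror the question, but the Post-Redirect-Get flow is an assumption about what is wanted:

        // Make the route token match the form field and the action parameter:
        routes.MapRoute(
            "Search",
            "search/{searchTerm}",
            new { controller = "Search", action = "SearchResults", searchTerm = UrlParameter.Optional }
        );

        // The POST carries the term in the request body, so redirect to reach /search/myterm:
        [HttpPost, ActionName("SearchResults")]
        public ActionResult SearchResultsPost(string searchTerm)
        {
            return RedirectToRoute("Search", new { searchTerm = searchTerm });
        }

        [HttpGet]
        public ActionResult SearchResults(string searchTerm)
        {
            return View();   // renders the results page for /search/{searchTerm}
        }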

    Read the article

  • VB.NET .ASPXAUTH

    - by Morgan
    I am working with a large site trying to implement Web Parts for particular users in a particular subdirectory, but I can't get the .ASPXAUTH cookie to be recognized. I've read dozens of tutorials and MS class library pages that tell me how it should work, to no avail. I am brand new to Web Parts, so I'm sorry if I'm unclear. The idea is that logged-in users can travel the site, but when they go to their dashboard, they are programmatically authenticated using Membership and FormsAuthentication to pull up their Personalization. When I step through the code, I can see the cookie being set, and that it exists on the following page, but Membership.GetUser() and User.Identity are both empty. I know the user exists because I created it programmatically using Membership.CreateUser(); I can see it when I do Membership.GetAllUsers(), and it's online when I use Membership.GetUser(username), but the Personalization doesn't work. Right now, I'm just trying to get the proof of concept going. I've tried creating the ticket and cookie myself, and also using SetAuthCookie() (code follows). I really just need a clue as to what to look for. Here's the "login" page...
        If Membership.ValidateUser(testusername, testpassword) Then ' Works
            FormsAuthentication.SetAuthCookie(testusername, true)
            Response.Redirect("webpartsdemo1.aspx", False)
        End If
    And the next page (webpartsdemo1.aspx):
        Dim cookey As String = ".ASPXAUTH"
        lblContent.Text &= "<br><br>" & Request.Cookies(cookey).Name & " Details"
        lblContent.Text &= "<br>path = " & Request.Cookies(cookey).Path
        lblContent.Text &= "<br>domain = " & Request.Cookies(cookey).Domain
        lblContent.Text &= "<br>expires = " & Request.Cookies(cookey).Expires
        lblContent.Text &= "<br>Secure only? " & Request.Cookies(cookey).Secure
        lblContent.Text &= "<br>HTTP only? = " & Request.Cookies(cookey).HttpOnly
        lblContent.Text &= "<br>Has subkeys? " & Request.Cookies(cookey).HasKeys
        lblContent.Text &= "<br/><br/>request authenticated? " & Request.IsAuthenticated.ToString
        lblContent.Text &= " Getting user<br/>Current User: "
        Dim muGidget As MembershipUser
        If Request.IsAuthenticated Then
            muGidget = Membership.GetUser
            lblContent.Text &= Membership.GetUser().UserName
        Else
            lblContent.Text &= "none found"
        End If
    Output:
        .ASPXAUTH Details
        path = /
        domain =
        expires = 12:00:00 AM
        Secure only? False
        HTTP only? = False
        Has subkeys? False
        request authenticated? False
        Getting user
        Current User: none found
    Sorry to go on so long. Thanks for any help you can provide.
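
    One hedged thing to check (not confirmed by the post): the FormsAuthenticationModule only turns the .ASPXAUTH cookie into User.Identity when forms authentication is enabled in web.config for the application (and not overridden in the subdirectory), roughly like this:

        <!-- Illustrative web.config fragment; the cookie name and loginUrl are assumptions. -->
        <system.web>
          <authentication mode="Forms">
            <forms name=".ASPXAUTH" loginUrl="login.aspx" timeout="30" />
          </authentication>
        </system.web>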

    Read the article

  • IIS not working

    - by 3bd
    I have a web site that was built in Visual Studio 2008, and I need to run it from my computer (Win 7 Ultimate) as a server. I tried to publish it to IIS and it is simply not working; I get the following error:
        Error Summary
        HTTP Error 500.19 - Internal Server Error
        The requested page cannot be accessed because the related configuration data for the page is invalid.
        Config Error
        This configuration section cannot be used at this path. This happens when the section is locked at a parent level. Locking is either by default (overrideModeDefault="Deny"), or set explicitly by a location tag with overrideMode="Deny" or the legacy allowOverride="false".
    Can anyone help?
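
    A hedged sketch of the usual remedy for this particular 500.19 variant: unlock the locked section in applicationHost.config with appcmd, run from an elevated command prompt (which section is locked varies; the two below are the most common culprits for ASP.NET sites):

        %windir%\system32\inetsrv\appcmd unlock config -section:system.webServer/handlers
        %windir%\system32\inetsrv\appcmd unlock config -section:system.webServer/modules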

    Read the article
