Search Results

Search found 30217 results on 1209 pages for 'website performance'.

Page 183/1209 | < Previous Page | 179 180 181 182 183 184 185 186 187 188 189 190  | Next Page >

  • Python Framework for small website

    - by mvid
    I am planning a small, simple website to showcase myself as an engineer. My preferred language is Python and I hope to use it to create my website. My pages will be mostly static, with some database-stored posts/links. The site will be simple, but I would like to have freedom in how it operates. I plan on using CSS/JS for the design, so I really just need an easy way to throw a small amount of content around. Some frameworks I have come across: Flask, CherryPy, Pinax. Are there any suggestions? Does anyone have any experience with Python on small/hobby websites?
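
    For comparison, a minimal Flask sketch of the kind of site described above (the route, the template, and the posts table are placeholders for illustration, not taken from the question):

        # minimal_site.py - a sketch of a mostly-static site with a few DB-backed posts
        import sqlite3
        from flask import Flask, render_template_string

        app = Flask(__name__)

        PAGE = "<h1>{{ title }}</h1><ul>{% for p in posts %}<li>{{ p }}</li>{% endfor %}</ul>"

        def load_posts():
            # Placeholder query; assumes a local SQLite file with a 'posts' table.
            with sqlite3.connect("site.db") as db:
                return [row[0] for row in db.execute("SELECT title FROM posts")]

        @app.route("/")
        def home():
            return render_template_string(PAGE, title="My work", posts=load_posts())

        if __name__ == "__main__":
            app.run(debug=True)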

    Read the article

  • What do you upgrade to make games load faster? [on hold]

    - by Superbest
    Let's say you have a relatively modern game like Shogun 2. The loading screens take several minutes. This bothers you and you'd like to improve it. What is actually going on when loading screens are up? I'm guessing assets are being loaded into memory from disk, and possibly being decompressed first. However, what is actually causing the slowdown? The memory? Mainboard? CPU? HDD? If you had $100 to spend on upgrades and your only goal is to speed up loading screens without reducing other performance, what component of the computer does it make sense to upgrade for maximum benefit? If your answer is "it depends on the existing setup", what sort of benchmarks would you run to determine what is causing the bottleneck? What if you had $500 instead? I give the two budgets for context. I am not asking for actual recommendations about which component to buy (nor are the numbers supposed to be rigid limits), but what features are important when shopping for components with small and large budgets (a large budget could allow buying multiple components which are not so good on their own, but work particularly well together). I mention Shogun 2 as an example, but I'm asking about reducing overall loading times, across all games, not just one game. Therefore, "put it on a solid state disk" probably won't be a good solution, because putting every game on your SSD will quickly fill it up.

    Read the article

  • Does AMD Cool n Quiet Slow Down Your System?

    - by Software Monkey
    I discovered today that having AMD Cool'n'Quiet enabled in my BIOS appears to be slowing down my Windows XP SP2 system by about 29% on memory & CPU intensive workloads. I was wondering if (a) anyone else had encountered this, (b) anyone can offer an explanation, (c) there are any negatives I need to be aware of if I keep AMD CnQ disabled. With some superficial testing so far, I don't immediately notice any difference with CnQ off (other than the performance being what I expected from this new hardware). It seems to ramp up the CPU fan a little bit as my program maxes out 1 core, but that's the same as with CnQ on. And when I let the system idle the CPU fan slows down and the system is as quiet as a mouse (after years of 6 small fans churning like they want to go into orbit it's nice to again have a system where I can hear the HDDs seeking). Bonus question: Does CnQ cause issues with system stability? I ask because the reason I disabled it was that I have had a few freezes and 1 spontaneous reboot with my new hardware.

    Read the article

  • Win 2008 R2 - copying TO disk is very slow, copying FROM is more or less okay

    - by avs099
    I have Windows 2008 R2 SP1 with 4 identical SATA disks (Seagate Barracuda 7200) in a RAID 5 array. It has 4 GB of memory; all recent updates are installed. Problem: when I copy a large file from one folder to another, I get about 10 MB/s average speed. When I read this file from a network share via a 1 Gbps connection I get about 25-30 MB/s. Both numbers seem low to me, but I'm specifically frustrated with the low write speed. There is no antivirus, no Hyper-V, it's just a file server, and when I do my tests nobody else reads/writes from it (we have only 4 people in the team, so I'm sure). Not sure if it matters, but there is only 1 logical disk "C" with all available space (1400 GB). I'm not an admin at all, so I have no idea where to look and what other information to provide. I did run Performance Monitor with "% idle time", "avg bytes read", "avg bytes write"; here is the screenshot: I'm not sure why there are such obvious spikes. Any idea? Please let me know if you need me to provide more information (what counters should I check, etc.). I'm very eager to get this solved. Thank you. UPDATE: we have another Windows 2008 R2 SP1 server with 2 RAID 1 arrays: one is disk C (where Windows is installed), the other is disk E. It is running Hyper-V and does not have antivirus. I noticed the following behavior when I copy a large file (a few GB):

        C -> C: about 50 MB/s
        C -> E: about 55 MB/s
        E -> E: 8 MB/s!!!
        E -> C: 8 MB/s!!!

    What could cause this? The E drive is a RAID 1 array of the same Seagate Barracuda 1TB drives.

    Read the article

  • Best Static Website Generator

    - by Nick Retallack
    In the age of dynamic websites built with layouts and templates, nobody wants to write plain old repetitive static HTML anymore. But now that you can outsource dynamic features to services like Disqus, and you could get slashdotted/dugg/reddited at any moment, sometimes a static website is best for scalability. There are quite a few static website generators out there that let you use templates, layouts, alternative markup languages, and other new age stuff. So this question is a bit of a survey. Which do you think is the best, and why? Here are a few examples to start us off: WebGen, StaticMatic, Static.
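
    As a rough illustration of what these tools do, here is a minimal static-generator sketch in Python using only the standard library (the page data and output directory are made up for the example; real generators read pages from source files and support richer templating):

        # build.py - render a list of pages through one template into static HTML files
        from pathlib import Path
        from string import Template

        TEMPLATE = Template("<html><head><title>$title</title></head><body>$body</body></html>")

        PAGES = [  # placeholder content; a real generator would read these from source files
            {"slug": "index", "title": "Home", "body": "<p>Welcome.</p>"},
            {"slug": "about", "title": "About", "body": "<p>About this site.</p>"},
        ]

        out = Path("public")
        out.mkdir(exist_ok=True)
        for page in PAGES:
            html = TEMPLATE.substitute(title=page["title"], body=page["body"])
            (out / f"{page['slug']}.html").write_text(html, encoding="utf-8")
            print("wrote", page["slug"] + ".html")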

    Read the article

  • Drupal - Lightbox -> iframe node displaying entire website with views

    - by kilrizzy
    I am attempting to make a view that lists thumbnails of my projects, then, when clicking one, enlarges the photo using Lightbox and lists some text and a link to the website. I am not sure if there is a way I can just add text to the lightbox using Views, so right now I have it using the field "Lightbox2 iframe: thumb200wh -> node page". The problem is that it opens my entire website again inside the lightbox instead of just the node: http://jeffkilroy.com/portfolio_boxes. Is there a way to display only the node from the Views module, or is there a way to just use an image but modify the description so that I can put text in?

    Read the article

  • How to give your website customers a secure feeling

    - by Saif Bechan
    I was wondering how you can give your website customers the confidence that you are not tinkering with the database values. I am planning on running a website which falls in the realm of an online game. There is some kind of credit system involved that people have to pay for. Now I was wondering how sites like this assure their customers that there is no foul play in the database itself. As I am the database admin, I can pretty much change all the values from within without anyone knowing I did, hence letting someone win who is not rightfully the winner. Is it maybe an option to encrypt and decrypt the credits people have so I can't change them? Or is there maybe a company I can hire that checks my company for foul play?
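
    One technique often suggested for this kind of tamper-evidence (named here as an illustration, not taken from the question) is to store a keyed signature next to each credit value, with the key held outside the database, e.g. by an external auditor. A minimal Python sketch, where the secret key and the column layout are assumptions:

        # credit_signing.py - sign credit values so silent edits in the database are detectable
        import hmac, hashlib

        SECRET_KEY = b"kept-outside-the-database"  # assumption: held by an auditor, not the DBA

        def sign(user_id: int, credits: int) -> str:
            msg = f"{user_id}:{credits}".encode()
            return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

        def verify(user_id: int, credits: int, signature: str) -> bool:
            return hmac.compare_digest(sign(user_id, credits), signature)

        # Example: the stored signature no longer matches if the credits column is edited.
        sig = sign(42, 1000)
        print(verify(42, 1000, sig))  # True
        print(verify(42, 9999, sig))  # False - value was changed without re-signing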

    Read the article

  • Setting the Hostname in IIS Bindings Breaks Website

    - by Josh
    I just got a Windows Server 2008 VPS and I'm having trouble getting IIS7 setup. I created a new website in IIS with the path, ip address, and hostname (like 'www.nameofsite.com') and click OK. When I browse to the site it pulls up "http://www.nameofsite.com" in the browser and... nothing... IE cannot display this webpage. If I blank out the hostname in the bindings and click [Browse] it works fine (it takes me to http://10.10.2.92 - the computer's ip). So entering the hostname breaks the website. Any ideas on what I'm missing? Services I might not have running or roles I'm missing? No server roles were initially installed on the VPS so I installed IIS, DHCP, DNS, and Application Server... overkill, but I wasn't sure what to install.

    Read the article

  • Basics for implementing SSL on PHP Website

    - by KoolKabin
    Hi guys, I am here as a developer of a website. My website has different modules, one of which processes credit cards. In order to process credit cards I need to implement an SSL layer and serve those pages over it. For the rest of the modules SSL is optional. Now my points are: 1.) Is the location of the files for HTTP and HTTPS the same? 2.) Can the session be shared between HTTP and HTTPS? This is required as I need the user login information and the cart item information.

    Read the article

  • Webfaction: How do I run a Static/Perl app and Django app under the same website

    - by swisstony
    I have an existing Perl app that I'm moving to a Webfaction website. I will be adding Django apps to this Webfaction website too. I would like the Django app to get first call, and so would want its URL path to be /. This would allow me to add any new URLs to urls.py as my app grows. If the URL doesn't match anything in urls.py I would like it to get passed to the static Perl app. For example:

        /app1 - Django
        /app2 - Django

    Everything else not picked up by urls.py I would want going to my Perl app, for example:

        /index.html - Static/Perl app
        /about.html - Static/Perl app
        /contact.html - Static/Perl app
        /apps/perlapp1.cgi - Static/Perl app
        etc.

    How do I go about achieving this in Webfaction?
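
    The Django half of that routing might look like the sketch below (the app names are placeholders); the fall-through to the static/Perl app would still have to happen outside Django, in how the applications are mounted on the website's URL paths:

        # urls.py - sketch: only /app1/ and /app2/ are claimed by Django
        from django.urls import include, path

        urlpatterns = [
            path("app1/", include("app1.urls")),  # placeholder app
            path("app2/", include("app2.urls")),  # placeholder app
            # No catch-all here: anything else should never reach Django if the
            # static/Perl application is mounted to handle the remaining paths.
        ]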

    Read the article

  • Deploying only changed part of a website with git to ftp (svn2web for git)

    - by Elazar Leibovich
    I have a website with many big image files. The source (as well as the images) is maintained with git. I wish to deploy it via FTP to a Bluehost-like cheap server. I do not wish to deploy the whole website each time (so that I won't have to upload too many unchanged files over and over), but to do roughly the following: in the git repository, mark the last deployed revision with a tag "deployed"; when I say "deploy revision X", find out which files have changed between revision X and the revision tagged "deployed", and upload just them. It is similar in spirit to svn2web, but I want that for a DVCS. A Mercurial alternative will be considered. It's a pretty simple script to write, but I'd rather not reinvent the wheel if there's a similar script on the web. Capistrano and fab seem to know only how to push the whole revision in their SCM integration, so I don't think I can currently use them.
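
    A rough sketch of such a script in Python, assuming a "deployed" tag already exists and plain FTP credentials (the host, user, and password are placeholders); handling deleted files and creating missing remote directories are left out:

        # deploy_changed.py - upload only files changed since the 'deployed' tag, then move the tag
        import subprocess
        from ftplib import FTP

        TARGET_REV = "HEAD"  # the "revision X" to deploy

        changed = subprocess.run(
            ["git", "diff", "--name-only", "deployed", TARGET_REV],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()

        with FTP("ftp.example.com", "user", "password") as ftp:  # placeholder credentials
            for path in changed:
                with open(path, "rb") as fh:
                    ftp.storbinary(f"STOR {path}", fh)  # assumes remote directories already exist
                print("uploaded", path)

        # Move the tag so the next run only sees newer changes.
        subprocess.run(["git", "tag", "-f", "deployed", TARGET_REV], check=True)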

    Read the article

  • virtual disk image - file or partition

    - by tylerl
    I'm looking at the differences between using a file versus a partition to store a virtual disk image in VM use. The common knowledge is that partition-based images are faster than file-based images because of a decreased overhead. It makes sense, but I've never seen any actual numbers. My own testing bears out a different result. When I benchmark a direct-to-partition virtual disk, then format that same partition with ext4, create a virtual disk image stored on that ext4 filesystem, and then benchmark that, I see no speedup at all for the direct-to-partition virtual disk. Instead on some systems the file-based image is even faster (possibly due to host OS caching or something like that). This test was repeated many times on many systems, with fairly consistent results. So perhaps throwing out the performance justification, is it still considered better to use a partition rather than a virtual disk image? Is there some other reason why direct partition access is better than image files? Or perhaps is there some reason to go the other way around? Perhaps an advantage in one of the virtual disk file formats that you don't get with raw partition images?
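
    For reference, the kind of crude sequential-throughput comparison described above can be approximated with a short Python script (the target path and sizes are placeholders; purpose-built tools such as fio or bonnie++ give far more reliable numbers, and the read figure here is optimistic unless caches are dropped first):

        # seq_bench.py - rough sequential write/read throughput for a given path
        import os, time

        PATH = "/mnt/test/bench.tmp"   # point this at the partition- or image-backed filesystem
        BLOCK = 1024 * 1024            # 1 MiB blocks
        COUNT = 512                    # 512 MiB total

        buf = os.urandom(BLOCK)
        start = time.time()
        with open(PATH, "wb") as f:
            for _ in range(COUNT):
                f.write(buf)
            f.flush()
            os.fsync(f.fileno())       # make sure the data actually hits the disk
        write_mb_s = COUNT / (time.time() - start)

        start = time.time()
        with open(PATH, "rb") as f:
            while f.read(BLOCK):       # may be served from the page cache
                pass
        read_mb_s = COUNT / (time.time() - start)

        os.remove(PATH)
        print(f"write: {write_mb_s:.1f} MB/s, read: {read_mb_s:.1f} MB/s")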

    Read the article

  • Jad file download link for my website

    - by Jareim
    Hi sir/madam! I am putting up a small website via webs.com for me and my friends that could also be accessible via wap, i.e. mobile internet, and I want to add links to my site that downloads .jar files. I uploaded the files on my site, and links to the .jar files went fine, but I also need links for .jad files (for some mobile phones that require .jad files FIRST then .jar). I tried doing a regular link for the .jad files, but it simply displayed the content of the .jad file. It wasn't installed or downloaded. What should I do? Or am I at a wrong website? Thanks!

    Read the article

  • Server nearly unusable when doing disk writes

    - by Wikser
    My question closely relates to my last question here on Server Fault. I was copying about 5 GB from a 10 year old desktop computer to the server. The copy was done in Windows Explorer. In this situation I would assume the server to be bored by the dataflow. But as usual with this server, it really slowed down. At least I could work with the remote session, even though there was some serious latency. The copy took its time (20 min?). In this time I went to a colleague and he tried to log in to the same server via remote desktop (for some other reason). It took about a minute to get to the login screen, a minute to open the control panel, a minute to open the performance monitor, ... Icons were loading maybe one per second. We saw the following (from memory):

        CPU: 2%
        Avg. Queue Length: 50
        Pages/sec: 115 (?)

    There was no other considerable activity on the server. The server seldom serves some ASP.NET pages, which also became very slow in this time. The relevant configuration is as follows:

        Windows 2003
        SEAGATE ST3500631NS (7200 rpm, 500 GB)
        LSI MegaRAID based RAID 5, 4 disks, 1 hot spare
        Write Through
        No read-ahead
        Direct Cache Mode
        Harddisk-Cache-Mode: off

    Is this normal behaviour for such a configuration? What measurements could give further clues? Is it reasonable to reduce the priority of such copy I/O and favour other processes like the remote desktop? How would you do that? Many thanks!

    Read the article

  • What does "warning: unable to unlink website: Operation not permitted" mean when checking out a Git branch?

    - by James A. Rosen
    I'm trying to create a local branch that tracks a remote branch. Here's what I get:

        > git checkout master
        > git push origin origin:refs/heads/myBranch
        Total 0 (delta 0), reused 0 (delta 0)
        To [email protected]:myrepo/myproject.git
         * [new branch] origin/HEAD -> myBranch
        > git fetch origin
        > git checkout --track -b myBranch origin/myBranch
        warning: unable to unlink website: Operation not permitted
        Branch myBranch set up to track remote branch myBranch from origin.
        Switched to a new branch 'myBranch'

    What does "warning: unable to unlink website: Operation not permitted" mean? Did everything work fine?

    Read the article

  • Performance-wise htaccess

    - by purpler
    Here's my htaccess template; I wonder if anything could be added to increase website performance.

        # Defaults
        AddDefaultCharset UTF-8
        DefaultLanguage en-US
        ServerSignature Off
        FileETag None
        Header unset ETag
        Options -MultiViews
        #Options All -Indexes

        # Force the latest IE version or ChromeFrame
        <IfModule mod_setenvif.c>
          <IfModule mod_headers.c>
            BrowserMatch MSIE ie
            Header set X-UA-Compatible "IE=Edge,chrome=1" env=ie
          </IfModule>
        </IfModule>

        # Proxy X-UA Setup
        <IfModule mod_headers.c>
          Header append Vary User-Agent
        </IfModule>

        #Rewrites
        Options +FollowSymlinks
        RewriteEngine On
        RewriteBase /

        # Redirect to non-WWW
        RewriteCond %{HTTPS} !=on
        RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
        RewriteRule ^(.*)$ http://%1/$1 [R=301,L]

        # Redirect to WWW
        RewriteCond %{HTTP_HOST} ^domain.com
        RewriteRule (.*) http://www.domain.com/$1 [R=301,L]

        # Redirect index to root
        RewriteRule ^(.*)index\.(php|html)$ /$1 [R=301,L]

        # Caching
        ExpiresActive On
        ExpiresDefault A0
        Header set Cache-Control "public"

        # 1 Year Long Cache
        <FilesMatch "\.(flv|fla|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav|png|jpg|jpeg|gif|swf|js|css|ttf|eot|woff|svg|svgz)$">
          ExpiresDefault A31622400
        </FilesMatch>

        # Proxy Caching
        <FilesMatch "\.(css|js|png)$">
          ExpiresDefault A31622400
          Header set Cache-Control "private"
        </FilesMatch>

        # Protect against DOS attacks by limiting file upload size
        LimitRequestBody 10240000

        # Proper SVG serving
        AddType image/svg+xml svg svgz
        AddEncoding gzip svgz

        # GZip Compression
        <IfModule mod_deflate.c>
          <FilesMatch "\.(php|html|css|js|xml|txt|ttf|otf|eot|svg)$">
            SetOutputFilter DEFLATE
          </FilesMatch>
        </IfModule>

        # Error page
        ErrorDocument 404 /404.html

        # Deny access to sensitive files
        <FilesMatch "\.(htaccess|ini|log|psd)$">
          Order Allow,Deny
          Deny from all
        </FilesMatch>

    Read the article

  • Facebook Social Plugins Like button returns "Website Inaccessible"

    - by buggedcom
    We've just added the Facebook Like button to http://www.willyoung.co.uk/global/songs-and-lyrics/releases/the_hits?page=1, however the JSON returned by Facebook is:

        for (;;);{"error":0,"errorSummary":"","errorDescription":"","errorIsWarning":false,"silentError":0,
        "payload":{"requires_login":false,"success":false,"already_connected":false,"is_admin":false,
        "show_error":true,"error_info":{"brief":"Website inaccessible",
        "full":"The page at http:\/\/www.willyoung.co.uk\/global\/songs-and-lyrics\/releases\/the_hits could not be reached."}}}

    It basically says that the website is not accessible. We've implemented this on other sites and it's fine. I'm not really sure where this is coming from. Any ideas?

    Read the article

  • Fast distributed filesystem for large amounts of data with metadata in a database

    - by undefined hero
    My project uses several processing machines and one storage machine. Currently, storage is organized as an MSSQL FileTable shared folder. Every file in storage has some metadata in the database. Processing machines execute tasks for which they need files from storage and their metadata. After completing a task, a processing machine puts the resulting data back in storage. From there it's taken by another processing machine, which also generates some file and puts it back in storage, and so on. Everything was fine, but as the number of processing machines increased, I found myself bottlenecked by the storage machine's hard drive performance. So I want the processing machines to put files in a distributed FS, to lift load from the storage machine, so that they can take data from each other and not only from the storage machine. Can you suggest a particular distributed FS which meets my needs? Or is there another way to solve this problem, without one? The amount of data in the FS at any one time is several terabytes (storage can handle this, but the processing machines cannot). Data consistency is critical. The read/write policy is: once a file is written, it is constant and may only be removed, not modified. My current platform is Windows, but I'm ready to switch if there is a substantially more convenient solution on another one.

    Read the article

  • Should I partition a 1TB Hard Disk whose primary use is media storage?

    - by Senthil
    I am going to get a 1TB hard disk. I will be storing 1080p or 720p movies, high-bitrate music and pictures on it. I use my PC 90% of the time only to play/listen/see those. I am running out of space in my current HD so I am getting another one. My specs are 2.7GHz Dual Core, 512MB GeForce 9400GT, 2GB DDR2 RAM and all the proper Matroska codecs/players. I guess that is enough to play 1080p movies without a glitch, given an ideal hard disk. I've read about proper partitioning giving performance improvements etc. I don't want my hard disk to be the bottleneck. Can someone tell me whether I should partition my 1TB hard disk into many drives? If I should, what is the ideal size of each partition? Smooth playing of movies is very important to me. Once I start filling up the disk, there is no turning back. So I want to get it right before I start. Thanks.

    Read the article

  • My processor is running slower than it should

    - by Soham
    I have a Core 2 Duo E7400 2.80GHz processor on my Intel D945GCNL mobo. From CPU-Z, I can see that my processor speed is 1596 MHz with a x6 multiplier and 266 MHz bus speed on each core. Why is my processor operating at 1596 MHz rather than 2.80 GHz? From my side I've tried to disable SpeedStep in my BIOS by setting EIST to 'Disable', and also tried to change the Power Option to 'High Performance' in Windows 7. I have also done what was suggested in this question: http://superuser.com/questions/119176/processor-not-running-at-max-speed. But it gained me nothing. I've also tried to run a few massive applications together to check whether the speed was increasing at that time or not, but it remains the same. Should I increase my multiplier or overclock to regain that lost speed? Should I check my power supply for any problem? Or anything else? Please help me on this. And yes, I have a desktop computer, so there are no problems caused by a battery. Here's my CPU-Z screenshot: http://i56.tinypic.com/2lk4mqc.jpg

    Read the article

  • Update website with a single command (git push) instead of FTP drag and dropping

    - by Wolfr
    Situation:

        I have a local copy of a website
        I have a server that I have SSH access to

    What do I want to do?

        Commit locally until I'm happy with my code
        Make branches locally
        Have one master branch that is the one that should be pushed to the server
        Update the website using a single command (git push origin master)

    If I set up a git repo locally using git init, and then push to a folder on the server, it doesn't work. When I FTP to the server to check the files, they're actually there. When I SSH into the server and do git status, it's not clean, even though it should be since I just pushed to the server. Steps I'm doing:

        Make a new folder on my computer (mkdir folder_x)
        Go into that folder (cd folder_x)
        Set up a new git repository there (git init) (git repository sets up successfully)
        Push the repository to the server using git push origin master (where origin is set up as user:[email protected])

    Read the article

  • How can I improve this search usability?

    - by Craig Whitley
    This is the first real programming attempt of mine, and there are some major flaws. It's a learning project, and I'm currently re-writing the entire thing as my PHP is really messy. I really want to get an idea of how I can improve the actual usability and accessibility of the site at the same time though, so I know how to implement it correctly. The website is basically a comparison website for gameserver hosting. As I mentioned, it's a learning project and I don't actually expect any revenue from it. At the moment there's only test data in it, so in the game input box select either 'Battlefield Bad Company 2' or 'Call of Duty 4: Modern Warfare' and ignore the actual search results. http://www.laglessfrag.com I wasn't really sure how to work the search functionality. Basically when you click a game in the drop down box, it sends an ajax request and finds all the locations available to that specific game in the database. After selecting the country there's another ajax request to find all the cities available to the game in that country, which gives me the two unique identifiers I need to create the search results. One major and fundamental flaw is that without JavaScript enabled, the site ceases to function. I'll overcome that in the next re-write, but without the ajax functionality stopping the user 'going wrong', how can I implement a search that requires two fields without creating extra steps in new pages after form submissions? I'm also no designer so my whole layout and CSS is a bit rubbish, but this was mainly a learning project as I'm interested in applications / programming rather than design. It's also slow as it's on shared hosting, but if I can get it to work correctly then I'm not averse to chucking a bit of money at it for faster hosting and maybe a bit of advertising and seeing where it goes (if anywhere!). Any info appreciated.

    Read the article

  • Can a pool of memcache daemons be used to share sessions more efficiently?

    - by Tom
    We are moving from a one-webserver setup to a two-webserver setup and I need to start sharing PHP sessions between the two load balanced machines. We already have memcached installed (and started) and so I was pleasantly surprised that I could accomplish sharing sessions between the new servers by changing only 3 lines in the php.ini file (the session.save_handler and session.save_path). I replaced:

        session.save_handler = files

    with:

        session.save_handler = memcache

    Then on the master webserver I set the session.save_path to point to localhost:

        session.save_path="tcp://localhost:11211"

    and on the slave webserver I set the session.save_path to point to the master:

        session.save_path="tcp://192.168.0.1:11211"

    Job done, I tested it and it works. But... Obviously using memcache means the sessions are in RAM and will be lost if a machine is rebooted or the memcache daemon crashes. I'm a little concerned by this, but I am a bit more worried about the network traffic between the two webservers (especially as we scale up), because whenever someone is load balanced to the slave webserver their sessions will be fetched across the network from the master webserver. I was wondering if I could define two save_paths so the machines look in their own session storage before using the network. For example:

        Master: session.save_path="tcp://localhost:11211, tcp://192.168.0.2:11211"
        Slave:  session.save_path="tcp://localhost:11211, tcp://192.168.0.1:11211"

    Would this successfully share sessions across the servers AND help performance, i.e. save network traffic 50% of the time? Or is this technique only for failovers (e.g. when one memcache daemon is unreachable)? Note: I'm not really asking specifically about memcache replication; more about whether the PHP memcache client can peek inside each memcache daemon in a pool, return a session if it finds one, and only create a new session if it doesn't find one in any of the stores. As I'm writing this I'm thinking I'm asking a bit much from PHP, lol... Assume: no sticky sessions, round-robin load balancing, LAMP servers.

    Read the article

  • Performance monitor shows 4294967293 sessions active

    - by TGnat
    I have an ASP.NET 3.5 website running in IIS 6 on Windows Server 2003 R2. It is a relatively small internal application that probably serves fewer than ten users at any given time. The server has 4 GB of memory and shows that 3+ GB is available while the site is active. Just minutes after restarting the web application, Performance Monitor shows that there are a whopping 4,294,967,293 sessions active! I am fairly certain that this number is incorrect; at the time of this reading there were only 100 requests to the website. Has anyone else experienced this kind of odd behavior from perfmon? Any ideas on how to get an accurate reading? UPDATE: After running for about an hour the number of active sessions has dropped by 4. So it does seem to be responding to sessions timing out.
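
    For what it's worth, that figure is exactly what a small negative counter value looks like when read as an unsigned 32-bit integer, which would be consistent with the counter having been decremented a few times past zero; a quick check in Python:

        # 4294967293 is -3 interpreted as an unsigned 32-bit value
        print(2**32 - 3)              # 4294967293
        print((-3) & 0xFFFFFFFF)      # 4294967293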

    Read the article
