Search Results

Search found 28303 results on 1133 pages for 'multi site'.


  • Sempron 145 core unlocking

    - by Grant
    Call me a cheapskate looking for performance, but I have a Sempron 145 processor that has been unlocked to an Athlon II X2 4450e on a GIGABYTE GA-990FXA-UD3 motherboard, yet only one of the two processor cores can be used at a time. If both cores are enabled in the BIOS, the kernel hangs during boot. If I reboot the system and disable the second core (but do not re-lock the processor), it takes me to GRUB and then starts normally. Is there a way to enable both cores and boot without kernel hangs?

    Read the article

  • What would be a topic for research on the cutting edge of multi-processor / multi-computer programming?

    - by Kabumbus
    I mean, what is not already there? What could be developed in a few months and produce a breakthrough, or start a new leap, in the science of multi-computer programming? What I see already exists: MPI, BitTorrent and Jabber protocols/APIs/servers for messaging over LAN, wires and other infrastructure cables for connecting machines; Boost and its analogs, on every OS and in most languages, for multithreading; and lots of CUDA-like on-computer frameworks for fast calculation on GPUs. What I personally do not see out there is a cross-platform framework for multiple-process interaction, meaning one that would allow easy creation of multiple processes running in parallel inside one host app on one machine, at a level no harder than what is needed for thread creation (so no separate server apps - just one library doing it all). Is there any such library, and what can you propose as a research topic?

    Read the article

  • Will we be penalized for having multiple external links to the same site?

    - by merk
    There seem to be conflicting answers to this question. The most relevant ones seem to be at least a year or two old, so I thought it would be worth re-asking. My gut says it's OK, because there are plenty of sites out there that do this already. Every major retailer site usually has links to the manufacturer of whatever item they are selling. Go to www.newegg.com and they have hundreds of links to the same site, since they sell multiple items from the same brand. Our site allows people to list a specific genre of items for sale (not porn - I'm just keeping it generic since I'm not trying to advertise), and on each item listing page we have a link back to the seller's website if they want one. Our SEO guy is saying this is really bad and Google is going to treat us as a link farm. My gut says that when we have to start limiting useful features of our site to boost our ranking, or start jumping through hoops like trying to hide text using JavaScript, then something is wrong. Some clients are only selling one to a handful of items, while a couple of our bigger clients have hundreds of items listed, so they will have hundreds of pages that link back to their site. I should also mention that a handful of pages from the bigger clients may appear to be duplicates, because they will be selling two or three of the same item and the only difference in the page content might be a stock number. The majority of the pages, though, will have unique content. So - will we be penalized in some way for having anywhere from a handful to a few hundred pages that all point to the same link? If we are penalized, what's the suggested way to handle it? We still want to give users the option to go to the client's site, and we would still like to give a link back to the client's site to help their own search engine rankings.

    Read the article

  • Middle-click does nothing but makes window controls appear

    - by hleinone
    Just did a fresh install of Precise Pangolin on my laptop and noticed that the middle-click (actually a three-finger tap on the touchpad) in Firefox doesn't work as it used to. When I do it on a link, the link doesn't get opened in a new tab; in fact, it doesn't get opened at all. Only the (useless) window size and position controls appear, as demonstrated on the terminal in the following screenshot. How do I get my tab-opening middle-clicks back?

    Read the article

  • Develop for Desktop and Mobile at once

    - by Hola Soy Edu Feliz Navidad
    I need to develop a program that will read some information from a special USB device and access information via WebSocket (socket.io preferred). My client wants to deploy this app on Mac OS X, Windows, Android and iOS. I was looking for development tools valid for both desktop and mobile, but it is still not clear to me; here is what I found. Unity: looks very good, but to create a native plugin for reading from the device it looks like I need the Pro edition. Adobe AIR: looks good, but I'm not sure whether I can write native plugins for all the wanted platforms. LiveCode: still immature (lots of bugs), and I don't know whether it is possible to write native plugins for my device. Is there any other good platform for my case? What's your opinion?

    Read the article

  • How do I make 3-finger multitouch work on a Samsung 530?

    - by RiaD
    I have a Samsung 530U4C. How can I configure 3-finger gestures to work? Choosing the next photo using a 3-finger "scroll" worked in Windows 7. If I use synclient TapButton3=2 I can use the tap as the middle mouse button, but there's a problem: the setting gets cancelled after a reboot and, as I read on the Internet, at other times as well. Moreover, it would be great to see all the gestures my laptop supports and configure them all as I need. I found some info about Touchegg, but I didn't manage to understand what exactly it is, or even run it. Two-finger gestures work fine (configured using the System Settings menu).
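
    As for the setting getting cancelled after a reboot: one common workaround is to re-apply the synclient command at login. A minimal sketch, under the assumption that the Synaptics X driver is in use and the desktop session honours XDG autostart entries:

        # Create an autostart entry that re-applies the mapping at each login.
        mkdir -p ~/.config/autostart
        printf '%s\n' '[Desktop Entry]' 'Type=Application' \
            'Name=Three-finger tap as middle click' \
            'Exec=synclient TapButton3=2' \
            > ~/.config/autostart/three-finger-tap.desktop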

    Read the article

  • In a multithreaded app, would a multi-core or multiprocessor arrangement be better?

    - by Michael
    I've read a lot on this topic already, both here (e.g., stackoverflow.com/questions/1713554/threads-processes-vs-multithreading-multi-core-multiprocessor-how-they-are or http://stackoverflow.com/questions/680684/multi-cpu-multi-core-and-hyper-thread) and elsewhere (e.g., ixbtlabs.com/articles2/cpu/rmmt-l2-cache.html or software.intel.com/en-us/articles/multi-core-introduction/), but I am still not sure about a couple of things that seem very straightforward. So I thought I'd just ask. (1) Is a multi-core processor in which each core has a dedicated cache effectively the same as a multiprocessor system (balanced, of course, for processor speed, cache size, and so on)? (2) Let's say I have some images to analyze (i.e., computer vision), and I have these images loaded into RAM. My app spawns a thread for each image that needs to be analyzed. Will this app on a shared-cache multi-core processor run slower than on a dedicated-cache multi-core processor, and would the latter run at the same speed as an equivalent single-core multiprocessor machine? Thank you for the help!

    Read the article

  • IIS mystery: "Deadlock detected" periodically makes site unavailable

    - by jskunkle
    A few times a day, our VB.NET (IIS 6.0) website randomly throws the following error and becomes completely unavailable for 5-15 minutes at a time while the application is recycled: ISAPI 'c:\windows\microsoft.net\framework\v2.0.50727\aspnet_isapi.dll' reported itself as unhealthy for the following reason: 'Deadlock detected'. The website ran for months on the exact same server in beta without problems, but the trouble started over the weekend when we made the site live. The live site is under some load, but less than many of our other production websites. How should I attack this problem? I've looked into orphaning the worker process and creating a dump file, but I'm not sure how to analyze that. Any advice or information is appreciated. Thanks, Shane
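
    A minimal sketch of one way to capture and inspect a hang dump of the worker process, assuming the Debugging Tools for Windows are installed (the output directory is a placeholder):

        :: Capture a hang-mode dump of the ASP.NET worker process
        adplus -hang -pn w3wp.exe -o C:\dumps

    Then, with the dump open in WinDbg:

        .loadby sos mscorwks    (load the SOS extension for .NET 2.0)
        !threadpool             (check for thread-pool exhaustion)
        !syncblk                (show managed locks and the threads holding them)
        !clrstack               (managed call stack of the current thread)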

    Read the article

  • Should I host my streaming media site on my own hardware?

    - by Reddy S R
    Hi, we are developing a new movie review site, more or less similar to Rotten Tomatoes. Since there will be a lot of streaming of movie trailers and we are expecting medium traffic, do you think third-party web hosting will cost a lot? Should we instead go for our own server hardware and software? We expect around 10GB of streaming per month from 2 to 6 months after the site's launch - less before and more after that period. What do you suggest? Thanks, Sridhar Reddy

    Read the article

  • CDN or separate site to store static content?

    - by marty
    If I understand this correctly, I have two options for static files: use a CDN and throw all my static files on it, or use a separate domain just to store the static files so users can download them simultaneously. So do I choose one or the other, or do I need a separate domain and then a CDN that pulls the files from that domain? I assume that even if I have a CDN I still need a local copy of the static content somewhere, either on my main site or on a static-content site like static.domain.com?

    Read the article

  • Access denied error when running site with SSL

    - by Gonzalo
    I've set up an SSL certificate to use on a website I'm working on. The problem is that when "Require SSL" is checked in IIS, I get the following error while trying to access the site: 403 - Forbidden: Access is denied. You do not have permission to view this directory or page using the credentials that you supplied. If that checkbox is not checked, the site works fine (I'm not sure why, but I can even access it through SSL). Not sure if it makes any difference, but my company has an ISA server that we use as a proxy/firewall. Thanks, Gonzalo

    Read the article

  • WordPress site on EC2 instance suddenly super slow

    - by Emil
    I set up a WordPress page the other day following this guide. The site was up and running, loading quickly, and all was well until today. Suddenly, loading the site takes forever, and it doesn't even work properly: the page renders incompletely. I tried rebooting the instance, but that didn't help. The only actions I've taken on the server are creating an elastic IP and pointing a domain name at that IP, and I don't see how that could have slowed down the page. Any thoughts on what could have caused this, and on a solution to the problem?
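
    A few generic first checks that might narrow this down (a suggestion, not from the original question; log paths vary by distro):

        free -m                               # is the instance out of memory?
        dmesg | tail                          # any OOM-killer messages?
        tail -n 50 /var/log/httpd/error_log   # recent Apache errors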

    Read the article

  • SharePoint 2010 site access denied for Active Directory group member

    - by Mia
    I created a blank site in SharePoint 2010 and, under Site Actions --> Permissions, removed all the users and added an Active Directory group which has me and a few others as members. After this I logged in as myself, and the portal no longer shows in the left navigation. If I try to browse to the portal as myself, it says "access denied". I don't know where I went wrong, and I've been stuck since yesterday. If someone could help, it would be great. Thanks.

    Read the article

  • Replace Whole Site by FTP

    - by Sam Machin
    Hi there, I've got a set of tools which periodically (about once a day) generate a complete set of static HTML pages for a site, with the associated folder structure etc. I then need to put those files onto the production server. My problem is that the server runs IIS (6, I think) and I only have regular FTP access. I need a way to automate the process of publishing the new site, and it needs to be a total replacement of the files each time it's published, e.g. delete the whole folder and its contents, then put the new ones up. My source server is an Ubuntu machine and I've got total control at that end. I have tried using CurlFtpFS, but it seems to be too slow for what I'm trying to do, and it locks up. Rgds, Sam
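
    From the Ubuntu side, one way to get exactly this behaviour is lftp's reverse mirror, which pushes a local tree to an FTP server and deletes remote files that no longer exist locally. A minimal sketch - host, credentials and paths are placeholders:

        #!/bin/sh
        # Push the freshly generated site to the server, pruning anything
        # remote that is no longer present locally.
        lftp -u "$FTP_USER","$FTP_PASS" \
             -e "mirror --reverse --delete --verbose /var/www/generated-site /site-root; quit" \
             ftp.example.com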

    Read the article

  • Invalid BOOT.INI (dual boot XP with 7)

    - by Muxa
    I had Windows XP x64 as my main system, and I also had a second partition with Windows XP x64; both booted from the first partition (C:). I then installed Windows 7 Ultimate on the first partition. I added NTLDR using BCDEDIT, and I also copied NTLDR, NTDETECT.COM and BOOT.INI onto the drive where XP remained. However, when I try to boot into Windows XP x64 I get:

        Invalid BOOT.INI file
        Booting from c:\windows\
        NTDETECT failed

    I found instructions on how to fix this using a boot disk; however, the partitions are on a software RAID. I tried to boot from a customized XP CD with the drivers, but for some reason it does not offer me a Repair option - just setup. The partitions that I have are: System Reserved, Main (Windows 7), Secondary (Windows XP x64). Here are the contents of my BOOT.INI:

        [boot loader]
        timeout=30
        default=multi(0)disk(0)rdisk(0)partition(3)\WINDOWS
        [operating systems]
        multi(0)disk(0)rdisk(0)partition(3)\WINDOWS="Windows XP Professional x64 Edition" /fastdetect

    Read the article

  • Site-to-site VPN with ISA 2006 to a DynDNS hostname?

    - by Klaus
    Hi all, I would like to create a site-to-site VPN between my ISA 2006 and a D-Link router on the other side. My ISA server has a fixed external IP address, but the D-Link only has a dynamic one, so it makes use of DynDNS. Every cheap router supports making VPNs to a hostname, but in ISA 2006 I have to enter an IP address in the VPN settings. Is there any way to create the VPN connection to a hostname? Thank you for any answers! Kind regards, Klaus

    Read the article

  • Multi-processor workstation as a workstation/server

    - by posdef
    I work in a research institute, and a number of the programs we use are computationally intensive (I actually wrote one of them). Right now we have one computer that is dedicated to one of these programs (with local accounts only, as in users physically sitting in front of that PC), and the other programs are run on individual workstations assigned to people. I have been looking around at common brands such as Dell and HP for some sort of small/medium-scale server which can be used as a workhorse by sending it tasks remotely. It appears as if there is nothing in between workstations with one 6-core processor and a bunch of extras (like fancy graphics etc.) and rack-mount servers with a ridiculous amount of RAM and HDD expansion capability but a relatively small number of processors/cores. I wonder if what I am looking for is such a small niche product? Are there other solutions that I might not be aware of? Does anyone know of a multi-processor, multi-core workstation/server that is still within a reasonable price range?

    Read the article

  • New web site on Windows Server 2008 with IIS7 does not work

    - by user22817
    Hi guys, I have a new domain, www.biografica.ro, which was bought 3 months ago but never used until now. I've bought a server with Windows 2008 Server and installed Web Server (IIS). I added a new site in the C:\inetpub\wwwroot directory and did the setup (assigned the default IP to the www.biografica.ro host, etc. - I did this on IIS6 a year ago, so I think I know how to set it up correctly). The problem is that the default site created by the IIS installation is working, but mine is not. It is started, but it says "This link appears to be broken" in Chrome and "The webpage cannot be found" in IE. Do you know what I've done wrong? As far as I know a domain takes time to propagate, but I think locally it should work. Please help - I've spent 3 hours and cannot find a way.
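
    To rule out DNS propagation while testing locally (a generic suggestion, not from the original question), one common trick is to map the domain to the server's IP in the hosts file of the machine you are testing from:

        # C:\Windows\System32\drivers\etc\hosts  (substitute the server's real IP)
        192.0.2.10    www.biografica.ro    biografica.ro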

    Read the article

  • Best choice for off-site backup: dd vs tar

    - by plok
    I have two 1TB single-partition hard disks configured as RAID1, of which I would like to make an off-site backup on a third disk, which I have yet to buy. The idea is to store the backup at a relative's house, considerably far away from my place, in the hope that all the information will be safe in the case of a global thermonuclear apocalypse. Of course, this backup would be well encrypted. What I still have to decide is whether to simply tar the entire partition or, instead, use dd to create an image of the disks. Is there any non-trivial difference between these two approaches that I could be overlooking? This off-site backup would be updated no more than two or three times a year, in the best of cases, so performance should not be a factor at all. What, and why, would you use if you were me? dd, tar, or a third option?
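
    For comparison, a minimal sketch of the two approaches with symmetric GPG encryption layered on top; device names and paths are placeholders:

        # Option 1: archive the mounted filesystem. Stores only the files,
        # so the backup is portable and restores onto any sufficiently large disk.
        tar -cpz -C /mnt/raid1 . | gpg --symmetric --cipher-algo AES256 \
            -o /mnt/offsite/backup.tar.gz.gpg

        # Option 2: image the RAID device. A bit-for-bit copy that preserves the
        # filesystem and free space, so the image is as large as the whole device.
        dd if=/dev/md0 bs=4M | gpg --symmetric --cipher-algo AES256 \
            -o /mnt/offsite/backup.img.gpg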

    Read the article

  • Windows PHP curl install: recommend a good site?

    - by phill
    So I'm struggling to get PHP curl installed on my Windows XP Professional machine, and I've tried probably five different sites which either don't work or refer to missing files such as the CA certificates. I'm looking to write a PHP script which logs into a site over SSL, captures the page data using a regex and emails it to me. Before I can get there, I need curl with SSL support. I was wondering if someone can recommend a better site or tutorial which effectively walks me through it step by step. Thanks in advance.
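
    For what it's worth, on Windows enabling the bundled extension usually comes down to two php.ini lines, plus having the OpenSSL DLLs shipped in the PHP folder (libeay32.dll and ssleay32.dll) on the PATH for SSL support. The extension directory below is illustrative:

        ; php.ini (Windows) - illustrative extension directory
        extension_dir = "C:\php\ext"
        extension = php_curl.dll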

    Read the article

  • CPU spikes on small site - possibly Apache or PHP config related

    - by Mike
    Hello, I hope you can help me. I have a site that I'm moving to a new datacenter. The server is pretty much vanilla: no control panel, and also no optimizations. When I hit a page, the site takes an extremely long time to load despite being relatively lightweight. I ran top to see what was happening, and the CPU jumps to 75%, then drops back down to about 20% while the rest of the page loads. Someone suggested that I run lsof -p on the offending processes, but I'm not sure what I'm looking at. I went through my httpd.conf file and commented out a bunch of loaded modules that didn't seem necessary, but that didn't help either. Anyone have any ideas? Output of the lsof: http://pastebin.com/mfa113f
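
    A minimal sketch of how one might pin down the spike; the httpd process name is an assumption (on Debian/Ubuntu the process is apache2):

        # Watch full command lines, sorted by CPU, while reloading the page.
        top -c

        # Snapshot the open files/sockets of the busiest Apache worker.
        lsof -p "$(pgrep httpd | head -n 1)"

        # Time the request itself, to separate server time from network time.
        curl -s -o /dev/null -w 'total: %{time_total}s\n' http://example.com/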

    Read the article

  • Tracking unique site views for 2012 - not my website

    - by user580950
    I am in trouble. I placed an advert on a website in 2012; the owner said the site had 950,000 unique visits each month, so early in 2012 I advertised with them. The advertising didn't work out, so I checked again after 2-3 months and saw that the unique visitors on their site were about 8,000 at the time. I immediately closed the account. I don't remember which site I used to check the unique visitors. That advertising company has now filed a dispute against me. So, is there any tool that can give me 2012 statistics for any website? I tried Google Trends, but it doesn't show statistics.

    Read the article

  • Memory upgrade - is the site reliable?

    - by Yuval
    Hi, I have a late-2008 unibody MacBook with 2GB of RAM, and I am looking to upgrade to 4GB. I looked about a month ago at Other World Computing's 4GB upgrade kit, and I remember it being around $80. When I looked today, finally ready to buy it, the price had gone up to almost $100. I found another site, memoryupgrade.pro, that calls itself "Pro memory upgrade"; it looks legitimate and sells the memory for around $80 under its own brand. The only thing is, I haven't been able to find any reviews of it, and I'm not sure whether it is actually reliable. Does anybody have any experience with this site? Does anybody have any other suggestions for buying MacBook memory? I have friends who bought from OWC and were happy; should I just spend the extra $30 (including shipping) and buy from them? Thanks!

    Read the article

  • Using wget to download PDF files from a site that requires cookies to be set

    - by matt74tm
    I want to access a newspaper site and then download their epaper copies (in PDF). The site requires me to log in using my email address and password, and then it permits me to access those PDF URLs. I'm having trouble 'setting my session' in wget. When I log into the site from my browser, it sets two cookie values:

        [email protected]
        Password=12345

    I tried:

        wget --post-data "[email protected]&Password=12345" http://epaper.abc.com/login.aspx

    However, that just downloaded the login page and saved it locally. The FORM on the login page has two fields, txtUserID and txtPassword, and radio buttons like this:

        <input id="rbtnManchester" type="radio" checked="checked" name="txtpub" value="44">

    and another button:

        <input id="rbtnLondon" type="radio" name="txtpub" value="64">

    If I post this to the login.aspx page, I get the same output:

        wget --post-data "[email protected]&txtPassword=12345&txtpub=44" http://epaper.abc.com/login.aspx

    If I add --save-cookies abc_cookies.txt, the file doesn't seem to contain anything other than the default content. For the last command, if I add --debug as well, it says:

        ...
        Set-Cookie: ASP.NET_SessionId=05kphcn4hjmblq45qgnjoe41; path=/; HttpOnly
        ...
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId 05kphcn4hjmblq45qgnjoe41
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx'
        ...
        Saving cookies to abc_cookies.txt.

    However, abc_cookies.txt shows ONLY the following:

        # HTTP cookie file.
        # Generated by Wget on 2011-08-16 08:03:05.
        # Edit at your own risk.

    (Not sure why I'm not getting any responses on SO - perhaps SU is a better forum - http://stackoverflow.com/questions/7064171/using-wget-to-download-pdf-files-from-a-site-that-requires-cookies-to-be-set)

    EDIT 1

        C:\Temp>wget --cookies=on --keep-session-cookies --save-cookies abc_cookies.txt --post-data "txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7" http://epaper.abc.com/login.aspx --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        --2011-08-18 08:15:59--  http://epaper.abc.com/login.aspx
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00a2ae80 (new refcount 1).
        ---request begin---
        POST /login.aspx HTTP/1.0
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Content-Type: application/x-www-form-urlencoded
        Content-Length: 100
        ---request end---
        [POST data: txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7]
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:17 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        Set-Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145; path=/; HttpOnly
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Content-Length: 107253
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx.1'
        100%[==========================================>] 107,253    24.9K/s   in 4.2s
        2011-08-18 08:16:05 (24.9 KB/s) - `login.aspx.1' saved [107253/107253]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

        C:\Temp>wget --referer=http://epaper.abc.com/login.aspx --cookies=on --load-cookies abc_cookies.txt --keep-session-cookies --save-cookies abc_cookies.txt http://epaper.abc.com/PagePrint/16_08_2011_001.pdf --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        --2011-08-18 08:16:12--  http://epaper.abc.com/PagePrint/16_08_2011_001.pdf
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00598290 (new refcount 1).
        ---request begin---
        GET /PagePrint/16_08_2011_001.pdf HTTP/1.0
        Referer: http://epaper.abc.com/login.aspx
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145
        ---request end---
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:30 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        content-disposition: attachement; filename=Default_logo.gif
        Cache-Control: private
        Content-Type: image/GIF
        Content-Length: 4568
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Length: 4568 (4.5K) [image/GIF]
        Saving to: `16_08_2011_001.pdf'
        100%[==========================================>] 4,568      7.74K/s   in 0.6s
        2011-08-18 08:16:14 (7.74 KB/s) - `16_08_2011_001.pdf' saved [4568/4568]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

    Contents of abc_cookies.txt:

        epaper.abc.com	FALSE	/	FALSE	0	ASP.NET_SessionId	owcrje55yl45kgmhn43gq145
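
    One detail worth checking (an assumption, since the full login form isn't shown): ASP.NET WebForms pages normally require the hidden __VIEWSTATE and __EVENTVALIDATION fields to be posted back, and a POST without them is typically answered with the login page again, which matches the behaviour above (the "PDF" that comes back is just a placeholder GIF). A sketch of the usual two-step dance with wget, written for a Unix-ish shell (adapt for Windows cmd); field names and values are placeholders, and the extracted values may still need URL-encoding:

        #!/bin/sh
        # Step 1: fetch the login page, keeping the session cookie.
        wget --keep-session-cookies --save-cookies abc_cookies.txt \
             -O login.html http://epaper.abc.com/login.aspx

        # Step 2: pull out the hidden WebForms fields (quick-and-dirty sed;
        # a real HTML parser is more robust).
        VIEWSTATE=$(sed -n 's/.*id="__VIEWSTATE" value="\([^"]*\)".*/\1/p' login.html)
        EVENTVAL=$(sed -n 's/.*id="__EVENTVALIDATION" value="\([^"]*\)".*/\1/p' login.html)

        # Step 3: post the hidden fields along with the visible ones, reusing
        # the same cookie jar, then fetch the PDF with the logged-in session.
        wget --load-cookies abc_cookies.txt --keep-session-cookies \
             --save-cookies abc_cookies.txt \
             --post-data "__VIEWSTATE=${VIEWSTATE}&__EVENTVALIDATION=${EVENTVAL}&txtUserID=abc%40gmail.com&txtPassword=12345&txtpub=44" \
             -O after_login.html http://epaper.abc.com/login.aspx
        wget --load-cookies abc_cookies.txt \
             http://epaper.abc.com/PagePrint/16_08_2011_001.pdf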

    Read the article
