Search Results

Search found 13022 results on 521 pages for 'antivirus tools'.

  • install pymedia and python audio tools

    - by aaron
    I noticed a pattern of errors while trying to install PyMedia and Python Audio Tools. For both modules I run the following:

        $ python setup.py install

    Then I get a series of compilation errors, ending with this:

        lipo: can't figure out the architecture type of: /var/folders/Kx/Kxxj4868HGi6VMhZLPyZN++++TI/-Tmp-//cch1y9AO.out
        error: command '/usr/bin/gcc-4.2' failed with exit status 1

    I'm running Mac OS X 10.5, and this happens whether I'm using gcc-4.0 or gcc-4.2, Mac-Python 2.5 or 2.6, or MacPorts-Python 2.6. What's going on?
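    A common workaround for this class of lipo failure on OS X is to stop distutils from attempting a fat (multi-architecture) build. A minimal sketch, assuming the stock gcc toolchain; pick the architecture that matches your Python build:

        # Hedged sketch: force a single-architecture build so gcc/lipo
        # never have to merge fat binaries.
        export ARCHFLAGS="-arch i386"   # or "-arch x86_64" / "-arch ppc"
        python setup.py build
        sudo python setup.py install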

    Read the article

  • Amazon S3 tools for Debian?

    - by Jonik
    I need to (programmatically, in a shell script) upload an EAR file to an Amazon S3 bucket on Debian (5.0.4). What, if any, Debian package provides simple, scriptable tools for that? (I want raw S3 bucket access, so please don't suggest solutions like Jungle Disk.)
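    A minimal sketch of one plausible route, assuming the s3cmd package is available for your Debian release (the bucket and file names are placeholders):

        # Hedged sketch: s3cmd gives raw, scriptable bucket access.
        sudo apt-get install s3cmd
        s3cmd --configure               # one-time: supply your AWS keys
        s3cmd put myapp.ear s3://my-bucket/myapp.ear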

    Read the article

  • database scanning tools recommendation?

    - by stock99
    Lately my boss has been asking me to look into database scanners. I know very little about database scanning technology. I'm just wondering if it is worth purchasing tools like Sentrigo ( http://www.mcafee.com/us/about/sentrigo.aspx )? Does anyone have experience using a database scanner? Also, I'd like to know if it has any limitations. We already conduct internal OS and web scans on a regular basis, by the way.
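    Before buying anything, a hedged first step is to inventory which database services are actually exposed; a minimal nmap sketch (the subnet is a placeholder, the ports are the common database defaults):

        # Hedged sketch: find exposed database services and probe their versions.
        nmap -sV -p 1433,1521,3306,5432 10.0.0.0/24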

    Read the article

  • What constitutes a "substantial, good-faith effort to remove the links"

    - by Luke McCallum
    We engaged the services of a 3rd-party SEO consultant to assist us in managing our meta data and to write regular blogs on our site http://cyberdesignworks.com.au. Without our authorisation, the SEO also ran a link-building campaign which has seen us Penguin-slapped, and we no longer appear in Google for a number of our core keywords. Since being notified by Google back in March that we have "unnatural links", we have undertaken a significant campaign to rid ourselves of these dodgy backlinks by a number of methods. I have just received feedback on my 4th or 5th resubmission, which still advises that we need to make a "substantial, good-faith effort to remove the links" before Google will reconsider us for inclusion. After the effort I have gone through to get links removed, I am at a loss as to what else I can do. Below is a summary of the actions we have taken to date; a redirect check follows the summary.

    According to http://removem.com we had about 5584 back-linking domains. Of those:

    - We successfully contacted and had links removed from 344 domains.
    - We ignored links from 625 domains, as they were either legitimate press releases, natural backlinks, or client websites containing an attribution link in the footer that points back to us.
    - Due to our efforts, or the sites simply becoming defunct, removem.com reports that links from 3262 domains have been removed.
    - We have contacted but are yet to receive feedback from 1666 domains, so we can assume those backlinks remain. We have configured an automatic 301 redirect for each of the links from these 1666 domains to point to http://redirects.sanscode.com/, which we are calling our Bad Link Catcher (a stroke of genius, I thought). E.g. http://www.mysimplewebdesign.com/create-a-perfect-webpage-with-four-important-tips-from-sydney-web-development-service-companies.php

    As we are a web design agency, we have a large number of client websites which contain an attribution link in their footer pointing back to us. We have gone through the vast majority of these and updated the links to replace the anchor text with an image and a rel="nofollow" link, i.e.

        <a rel="nofollow" target="_blank" href="http://www.cyberdesignworks.com.au/"><img src="https://sessions.sanscode.com/site/assets/media/badges/Badge_CDW_SANSCODE.png"></a>

    See http://www.milkatwork.com.au/

    An export from http://removem.com detailing the number of times we contacted each link, and whether it is still found or not, was also supplied with each resubmission. The total number of backlinks reported in Google Webmaster Tools has dropped from over 100K to 87K, and I expect it to drop significantly lower once Google re-crawls each back-linking page.

    Based on all of the above, I am not sure what else I can do to demonstrate a "substantial, good-faith effort to remove the links". I would sincerely appreciate any feedback or suggestions, as I am out of ideas.
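    A minimal sketch of a sanity check on the 301 setup described above (the URL is the example from the post; %{redirect_url} requires a curl version that supports it):

        # Hedged sketch: confirm a caught backlink target answers 301
        # and points at the Bad Link Catcher.
        curl -s -o /dev/null -w "%{http_code} %{redirect_url}\n" \
          "http://www.mysimplewebdesign.com/create-a-perfect-webpage-with-four-important-tips-from-sydney-web-development-service-companies.php"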

    Read the article

  • Problem installing SQL Server client tools

    - by Shiraz Bhaiji
    We are trying to install SQL Server 2005 Standard on Windows 2008 Standard, both 64-bit. We have done this before with no problems. This time we get an error during the installation of the client tools:

        There was an unexpected failure during the setup wizard
        Link Id 20476, message ID 50000

    There are no errors in the event log. Does anybody have any ideas what could be wrong?

    Read the article

  • How to manage two separate testing teams using different test tracking tools

    - by newuser
    I have two independent testing teams currently testing the same application. One team is using ClearQuest, and the other is using Mantis. It has been a huge effort to manage all of the duplicate bug reports. What options would improve this situation? My constraint is that the ClearQuest team will not change test-reporting tools, and migrating the other team to ClearQuest would come with a large training effort.

    Read the article

  • Rack layout tools

    - by Luke
    I'm wondering if there are any tools (preferably offline) that would allow me to lay out all of the new equipment that will be going into several standard racks. Currently I'm using Excel to map out all of the slots and columns for the data, but I suspect there is a better method of doing this. Suggestions? Edit: Dell has an online tool, but it doesn't seem very good at actually saving the data you're working on (and it's obviously geared towards Dell hardware).

    Read the article

  • Cannot install PC Tools firewall

    - by Philip
    I am unable to install PC Tools Firewall. It is fine up until after rebooting; then the PC Tools icon in the system tray hangs and indicates it is "initializing". I have waited 5 minutes with no change, and I have tried multiple times. The Vista firewall was turned off prior to the attempted installation. Help! Thank you.

    Read the article

  • Meaning of Crawl errors

    - by com
    My question is about the definition of Crawl Errors in Google Webmaster Tools. Crawl Errors is divided into a few sections.

    Let's first consider the HTTP section. I assume that all broken links in this section were somehow found by the crawler, and that these are not the links from the sitemap. If all these links were found by scanning pages from the sitemap for links, why doesn't it mention the source page, like the Sitemap section does with its Linked From column? Please correct me if I am wrong.

    The Sitemap section: it looks like all these links came from my sitemap. But there is a Linked From column. I already know that all these broken links are from the sitemap, so in order to fix the errors I should revise my sitemap. Am I wrong?

    The Not Followed section: I don't know what this means. It looks like it accumulates all links that caused a redirect, but for some reason Google considers all these redirects wrong. Do you know if there is any set of rules for how a wrong redirect is determined? I actually found my mistake: I tried to normalize URLs and redirect them to the right URL, but I did the normalization in a wrong way.

    The Not Found section: this is like the HTTP section, but with 404 errors. This section has a Linked From column, but very often Linked From is "unavailable". What does that mean? Google cannot tell me how it found this non-existent page? And how is this section related to the Sitemap section: does it contain all the 404 links from the sitemap too? There are too many 404 links, many more than in the sitemap. I took a look at what we have in Linked From, and I saw that one link came from the sitemap two months ago. But why does Google keep it indexed? The link is already dead, and the new sitemap doesn't have it. Is there any expiry date for old links?

    The Unreachable section: it looks like this section is for 500 errors. It doesn't contain a Linked From column. There are too many completely meaningless links; I really don't know where this stuff came from, and without Linked From I am not able to figure out how to deal with it.

    Sorry for such a big topic, but I just want to make it clear what every section stands for, because that is extremely crucial in order to deal with all these problems. Hopefully it will be useful not just for me. Thanks!
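    On the Not Followed point, a minimal sketch for inspecting exactly what redirect chain a normalized URL produces (the URL is a placeholder):

        # Hedged sketch: print each hop's status line and Location header.
        # Useful for spotting broken URL normalization.
        curl -sIL "http://example.com/old-page" | grep -E "^(HTTP|Location)"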

    Read the article

  • GWT: Generate more complete crawl error report

    - by Mike
    I'm a developer in charge of managing Webmaster Tools and related issues (including correcting crawl errors) for dozens (hundreds, maybe?) of active sites, and as part of my duties I create a report of every discrepancy, including all pages generating a 404 and all pages that link to those pages. Currently within Webmaster Tools I'm able to download a CSV file of all pages with a 404 response, but I'm then having to manually click on every single one of those links and copy the "linked from" field to paste into my spreadsheet. This is extremely tedious and seems unnecessary; I would expect the ability to download all that data at once. I'm ultimately looking for the end result of one CSV file that has every URL with a 404, but also has every URL that links to each one of them. Am I overlooking this functionality somewhere, or does anyone have a good solution?

    Edit 1 (2/11/2013): Example of what the CSV output looks like now:

        URL,Response Code,News Error,Detected,Category
        http://www.abcdef.com/123.php,404,,11/12/13,Not found
        http://www.abcdef.com/456.php,404,,11/12/13,Not found

    Which is great, but let's say 123.php has 5 pages that link to it. Now I have to duplicate that row in my spreadsheet 4 more times, then go into Webmaster Tools, get all the URLs that link to the page, and add that data to my spreadsheet. The output I would prefer:

        URL,Response Code,Linked From,News Error,Detected,Category
        http://www.abcdef.com/123.php,404,http://www.ghijkl.com/naughtypage1.php,,11/12/13,Not found
        http://www.abcdef.com/123.php,404,http://www.ghijkl.com/naughtypage2.php,,11/12/13,Not found
        http://www.abcdef.com/123.php,404,http://www.ghijkl.com/naughtypage3.php,,11/12/13,Not found
        http://www.abcdef.com/456.php,404,http://www.ghijkl.com/naughtypage1.php,,11/12/13,Not found
        http://www.abcdef.com/456.php,404,http://www.ghijkl.com/naughtypage2.php,,11/12/13,Not found
        http://www.abcdef.com/456.php,404,http://www.ghijkl.com/naughtypage3.php,,11/12/13,Not found

    Note the (hypothetical) addition of a "Linked From" column, as well as the fact that there are only 2 unique URLs now (like before), but all of the "Linked To" pages are shown in one report.

    Edit 2 (2/12/2013): To clarify, my question is less about detecting and correcting 404s, and more about generating a report of what Google has listed as errors. Oftentimes these errors aren't even valid anymore, but I still need documentation to show that Google detected a problem and that the problem is now fixed. Many of the "linked from" URLs I find are actually outdated, cached resources. For example, I'll frequently see that the linked-from URL is the sitemap, which is actually an old sitemap cached by Google that points to an old page. Neither the sitemap nor the old page exists, but they still appear in my crawl error reports because they are cached resources.
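    I'm not aware of a built-in export for this, but if the linked-from data can be exported or scraped into a second CSV, a hedged shell join produces the merged report (links.csv and its url,linked_from layout are hypothetical):

        # Hedged sketch: merge the 404 export with a hypothetical
        # links.csv (url,linked_from) on the URL column.
        join -t, -1 1 -2 1 \
          <(tail -n +2 errors.csv | sort) \
          <(tail -n +2 links.csv | sort) > merged.csv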

    Read the article

  • Network and Server Management Tools

    - by jessieE
    We are building a farm of test servers; currently we have 8. We are planning to use the servers to test:

    - MySQL Cluster
    - Xen or KVM virtualization
    - Heartbeat/Pacemaker/DRBD

    What tools do experienced sysadmins use for the following?

    - Initial installation of the operating system (installing CentOS 5 or Ubuntu Server manually 8 times seems like a tedious task that just begs for automation; see the sketch after this list)
    - Centralized configuration management and software updates for host and possibly guest (virtualized) servers
    - Hardware, services and network monitoring
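    For the unattended-install point, a minimal sketch assuming the machines are KVM guests and a reasonably recent virt-install; the host names, disk path, mirror URL and ks.cfg are all placeholders:

        # Hedged sketch: kickstart-driven unattended CentOS 5 installs.
        for host in node1 node2 node3; do
          virt-install --name "$host" --ram 2048 \
            --disk path=/var/lib/libvirt/images/"$host".img,size=20 \
            --location http://mirror.centos.org/centos/5/os/x86_64/ \
            --initrd-inject ks.cfg --extra-args "ks=file:/ks.cfg" \
            --noautoconsole
        done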

    Read the article

  • Linux tools to choose suitable Cisco ASA 5500

    - by linuxcore
    I have a Linux web hosting server which is being hit by heavy DDoS attacks. I want to use a Cisco ASA 5500 Series Adaptive Security Appliance to protect the Linux server from this DDoS. I know there are many factors you should understand before choosing a suitable hardware firewall, like the size of the DDoS, packets per second, etc. Please suggest Linux tools to measure those factors and to help me collect the required information (pps, size of the DDoS, concurrent connections and other factors). Regards,
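    A minimal sketch of stock Linux commands that give rough numbers for those factors (eth0 is a placeholder for your interface):

        # Hedged sketch: rough DDoS sizing with standard tools.
        sar -n DEV 1                  # packets/sec and bandwidth per interface (sysstat)
        ss -s                         # summary of concurrent connections
        iftop -i eth0                 # live per-flow bandwidth
        tcpdump -nn -i eth0 -c 10000 -w sample.pcap   # sample capture for offline analysis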

    Read the article

  • Edit Top 200 Rows SQL Pane equivalent in Visual Studio 2012 SQL Server Tools "View Data"

    - by Johan Kronberg
    I've always used Edit Top 200 Rows and then edited the query in the SQL Pane of Management Studio 2008 to find the rows I want to edit data for. Now I have the tools inside Visual Studio 2012 and want to be able to change the query after right-clicking a table and choosing "View Data", but I can't see that this is possible. Has the "SQL Pane" feature been removed, or am I just not seeing something?

    Read the article

  • Bacula optimization/profiling tools

    - by pufferfish
    I'm trying to get an idea of where the bottlenecks are in our backup system. Are there tools available for profiling this? If not, any pointers to a home-grown method would also help. I guess most of the info would be in the Bacula logs, but I'd also like to see what gets saturated during despooling: disk, CPU or network? This feels like a problem most Bacula admins would have encountered.
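    Absent a Bacula-specific profiler, a hedged approach is to watch the usual suspects while a despool runs (device names and intervals are placeholders):

        # Hedged sketch: run these during a despool and see which saturates first.
        iostat -dxm 5                 # disk utilization and MB/s (sysstat)
        sar -n DEV 5                  # network throughput per interface
        top                           # CPU use, including bacula-sd / bacula-fd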

    Read the article

  • use Amazon EC2 tools in Capistrano to get the servers to push the code

    - by APZ
    I am trying to use the EC2 tools to get all the machines with a particular tag into some kind of array in the /config/deploy/prod.rb file in Capistrano. Something like this in prod.rb (untested pseudocode):

        workers-array[] = $(ec2-describe-instances -F vpc-id=1234 -F tag:Env=prod -F tag:SystemType=worker)
        for (i = 0; i < workers-array.len; i++) {
            role :worker-A, workers-array[i]
        }

    I am not sure how to do this in Capistrano, and I'm new to Ruby too. Any help on this would be really appreciated.
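    A minimal sketch of the EC2 half in plain shell (the awk field position is an assumption; INSTANCE-row layouts vary across EC2 API tools versions). In prod.rb you could then capture this output with Ruby backticks and pass each name to role:

        # Hedged sketch: list public DNS names of matching worker instances.
        # Field $4 (public DNS on INSTANCE rows) is an assumption; check
        # your ec2-describe-instances output format.
        ec2-describe-instances -F "vpc-id=1234" -F "tag:Env=prod" \
          -F "tag:SystemType=worker" | awk '/^INSTANCE/ {print $4}'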

    Read the article

  • Removing Microsoft Visual Studio 2010 Tools for Office Runtime (x64)

    - by helloworld922
    I'm trying to remove the Microsoft Visual Studio 2010 Tools for Office Runtime (x64). It says it's uninstalled when I try to remove it from Control Panel->Programs and Features, but the list item is still there, and if I try to remove it again, it brings up the prompt to install it. How do I get rid of this program completely (or at least remove it from the add/remove list)? I don't have Microsoft Office installed (or Visual Studio).

    Read the article
