Search Results

Search found 502 results on 21 pages for 'outdated'.

Page 11 of 21

  • Double-clicking to open Office docs is slow; File -> Open is fast.

    - by Keith
    I have 2 unique networks. They both share a similar architecture:

    - Windows 2003 SBS SP2
    - Running Symantec Endpoint
    - Running Symantec Information Foundation
    - Shared drives off a data partition
    - Clients running Office 2003 or 2007
    - Connecting to the file server through mapped drives

    When users open a file from their local PC by double-clicking, it takes 30-60 seconds to open. When they use File -> Open, those same documents open almost immediately. So far I've tried the following: running CCleaner to purge the registry of outdated mapped drives, disabling "using DDE", disabling A/V, and rebooting. Any ideas beyond that? I figured this question belongs here instead of SU since it's the same issue on different networks.
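
    One of the usual suspects here is the DDE handshake Office performs on a double-click. A minimal diagnostic sketch in Python (assuming Python 3 on a client machine; Word.Document.12 is the Office 2007 ProgID, and Office 2003 uses Word.Document.8):

        import winreg

        # Look at the shell-open command and DDE key for Word documents.
        # Word.Document.12 is the Office 2007 ProgID; use Word.Document.8
        # for Office 2003.
        progid = r"Word.Document.12\shell\Open"

        with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, progid + r"\command") as key:
            print("open command:", winreg.QueryValue(key, None))

        try:
            with winreg.OpenKey(winreg.HKEY_CLASSES_ROOT, progid + r"\ddeexec") as key:
                print("DDE exec:", winreg.QueryValue(key, None))  # key present: DDE in use
        except FileNotFoundError:
            print("no ddeexec key: DDE already disabled for this file type")

    If the ddeexec key is still present even though "using DDE" was unticked in the UI, that mismatch would explain why only the double-click path stays slow.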

    Read the article

  • Is there an easy way to always redirect facebook.com to facebook.com/events/list? (Firefox, OS X)

    - by Tor Thommesen
    I often use Facebook for communication while working. When I do, I'd rather not see the Facebook news feed. Unfortunately it's hard to remember not to go to the front page and to use facebook.com/messages or facebook.com/events instead. Is there any way to always redirect the URL https://www.facebook.com/ to https://www.facebook.com/events/list? I use Firefox on OS X. I have tried the Redirector addon and I'm not able to make it work; I'm not sure if I'm missing something obvious or if it's outdated/buggy.

    Read the article

  • Is there still a place for tape storage?

    - by Jon Ericson
    We've backed up our data on LTO tapes for years and it's a real comfort to know we have everything on tape. A sister project and one of our data providers have both moved to 100% disk storage because the cost of disk has dropped so much. When we propose systems to potential customers these days we tend to downplay or not mention our use of tape systems for data storage since it might seem outdated. I feel more comfortable with having data saved in two separate formats: disks and tape. In addition, once data is securely written to tape, I feel (perhaps naively) that it's been permanently saved. Not having to rely on a RAID controller to be able to read back data is another plus for me. Do you see a place for tape backup these days?

    Read the article

  • Malzilla Tutorial #4 - how to get it working

    - by Sim
    I am trying to understand the basic workflow of using Malzilla and am therefore doing the tutorials, but I am already lost in the first one where you actually get to do something, as it doesn't work for me at all even though I follow it step by step. In the first paragraph, where you have to concatenate the string and then translate it using MiscDecoders->DecodeUCS2, I get non-printable characters as the result after concatenating and running the decoder. In the second paragraph, where you are told to run the script after adding the function call at the bottom of the script, the decoder tells me that it cannot handle the arguments.callee.toString() method and therefore won't run the script. How outdated are those tutorials, and how can I get them to work?

    Read the article

  • Rsync and Windows 7

    - by Nate
    Can someone give me any tips on setting up some sort of rsync server/client on Windows 7 to run rsync between my web hosting server and a backup server running Ubuntu? I've tried setting it up with this tutorial: http://www.youtube.com/watch?v=CvwdkZLNtnA using CopSSH and cwRsync. I ran into all sorts of trouble, including not being able to get cwRsync to run (it installs properly but never starts up) and CopSSH not generating the keys at all. The guy in the video was running Windows Server 2003, though, so I'm guessing the problems could just be because I'm running Windows 7. I've been trying to set it up with my Windows machine as the rsync server, and Ubuntu and my web hosting VPS as the clients, but I realize it may be easier (and make more sense) to set up the rsync server on Ubuntu and the rsync client on Windows 7. Can anyone point me in the right direction? I'm thinking of using this guide: http://www.gaztronics.net/rsync.php It seems a bit outdated, though.
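
    For reference, a typical pull-style invocation with the server on Ubuntu and only the cwRsync client on the Windows side might look like the sketch below (a Python 3 wrapper; every path, host name, and key location is a placeholder):

        import subprocess

        # Pull the web host's document root into a local backup folder over SSH.
        cmd = [
            r"C:\cwrsync\bin\rsync.exe",
            "-avz",              # archive mode, verbose, compress in transit
            "--delete",          # propagate deletions so the copy stays a mirror
            "-e", "ssh -i /cygdrive/c/cwrsync/home/backup_key",  # key-based login
            "backup@vps.example.com:/var/www/",   # remote source (placeholder)
            "/cygdrive/c/backups/www/",           # local destination (placeholder)
        ]
        subprocess.run(cmd, check=True)

    Run this way, Windows only needs the rsync client; the SSH and rsync server pieces live on the Ubuntu box, which sidesteps the CopSSH key-generation trouble entirely.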

    Read the article

  • Windows hosted network on Windows 8

    - by tanmaysingh
    My sister is using an Acer laptop that has an Intel WLAN card and supports Windows hosted networks (checked in the command prompt). I am doing it the basic way with the netsh wlan commands: first creating the hosted network, then starting it. The hosted network gets created and can be seen in Network and Sharing Center. I also went to Device Manager, selected the WLAN card, and made sure "allow this device to wake the computer from sleep" is selected. Then I went to the newly created connection, clicked Properties, and under Sharing enabled "allow this to be shared", but I am not able to select the name of the Wi-Fi network my sister wants to share (in this case her college Wi-Fi, let's say QP4; Ethernet and Local Area Connection are the only two options that come up). Also, the SSID is not shown on her mobile, nor when I click the signal icon in the tray area. Any suggestions on what might be going wrong? She is using Windows 8. Should I look for updated drivers? Her laptop is only 6 months old, so I don't think the drivers are outdated. Any advice would be appreciated.
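
    For reference, the usual sequence is netsh wlan set hostednetwork followed by netsh wlan start hostednetwork; a sketch that drives it from Python 3 in an elevated prompt (SSID and key are placeholders):

        import subprocess

        # Configure and start a Windows hosted network; must run elevated.
        subprocess.run(
            ["netsh", "wlan", "set", "hostednetwork",
             "mode=allow", "ssid=MyHotspot", "key=ChangeMe123"],
            check=True,
        )
        subprocess.run(["netsh", "wlan", "start", "hostednetwork"], check=True)

        # Report status, including whether the driver supports hosted networks.
        subprocess.run(["netsh", "wlan", "show", "hostednetwork"], check=True)

    If "show hostednetwork" reports the network as started but the SSID still is not broadcast, that points at the WLAN driver rather than at the sharing configuration.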

    Read the article

  • How to force 640x480@60Hz screen resolution on Xubuntu 12.04

    - by c2h2
    It seems Xubuntu is not able to correctly set the resolution to 640x480@60Hz in its Display settings, and I am unable to correctly drive my super small 6.4 inch Mitsubishi VGA panel over the VGA cable. I have tried to hack both the X11 configuration (/etc/X11/xorg.conf) and the xfce4 configuration, but all the documentation I can find is outdated and the config files have moved to other locations. Can someone give me a hand? I'll mark the correct answer for other people to use. Thanks! EDIT: The board is an Intel Atom D2700; the GPU is an SGX545. I tried xrandr --output default --mode 640x480 and it seems to work fine, but the refresh rate is 75Hz and the screen only supports 60Hz. So I used xrandr --output default --mode 640x480 --rate 60 but it gives the error: xrandr: Failed to get size of gamma for output default. Can anyone point me in the right direction?
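
    When the driver only offers 75Hz, the usual workaround is to generate a 60Hz modeline and register it by hand; a sketch (Python 3 driving xrandr; the modeline numbers come from running cvt 640 480 60, and the output name "default" matches the xrandr output quoted above):

        import subprocess

        # Modeline produced by `cvt 640 480 60`: pixel clock plus h/v timings.
        timings = ["23.75", "640", "664", "720", "800",
                   "480", "483", "487", "500", "-hsync", "+vsync"]

        subprocess.run(["xrandr", "--newmode", "640x480_60.00", *timings], check=True)
        subprocess.run(["xrandr", "--addmode", "default", "640x480_60.00"], check=True)
        subprocess.run(["xrandr", "--output", "default", "--mode", "640x480_60.00"],
                       check=True)

    The "Failed to get size of gamma" line is usually only a warning; the mode switch can still succeed despite it.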

    Read the article

  • How to correctly handle redirect after site facelift

    - by Stefan
    I recently updated our site, taking it from a multi-page site to a single-page site. The problem now is that when the site is searched on, say, Google, it displays the site as well as the old indexed pages, so if a user clicks, say, our "About" page, it takes them to our now outdated material. I am hoping to get some guidance on how to handle this properly. I figure the first step is to set up a robots.txt on our new index page telling the engines not to crawl beyond index.php. But in the meantime, how do I handle the fact that users searching for our site on Google may still click on sub-page links? Should I simply set up redirects while waiting for the engines to update? And if so, do I need to set up redirects on each page using PHP, or is this something I would take care of in our site's control panel? I am not very familiar with redirects... Any help is appreciated!
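
    For the interim, a permanent (301) redirect from every old sub-page to the new single page is the standard answer: browsers land on the right content, and search engines learn to drop the old URLs. A minimal sketch of the mechanics using Python's standard library (paths are placeholders; on a PHP site the equivalent is header('Location: /', true, 301) at the top of each old page):

        from http.server import BaseHTTPRequestHandler, HTTPServer

        # Old sub-page paths (placeholders) that should redirect permanently.
        OLD_PATHS = {"/about.php", "/contact.php", "/services.php"}

        class RedirectHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                if self.path in OLD_PATHS:
                    self.send_response(301)          # permanent: crawlers update
                    self.send_header("Location", "/")
                    self.end_headers()
                else:
                    self.send_response(404)
                    self.end_headers()

        HTTPServer(("", 8000), RedirectHandler).serve_forever()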

    Read the article

  • Exchange 2010 sending old Out of Office message

    - by Tatas
    We are just about done with our migration from Exchange 2003 to 2010. I have a user, now out on medical leave, who has been migrated to the new system. He has gone into OWA and set up his Out of Office notification. The good news is that an Out of Office message is sent; the bad news is that it appears to be an old, outdated message from back when the user was on Exchange 2003, not the new message set up in OWA. The user has also tried setting this up in Outlook 2010, with the same behavior. I have a feeling this is related to the old public folders (didn't they contain OOF messages?) still lingering around our Exchange org. Any ideas?

    Read the article

  • Firefox Task Manager [closed]

    - by chris Frisina
    CLOSED!? I SPECIFICALLY said for OS X; the previous question provides answers for Windows, and the other add-ons are for older versions of Firefox or do not answer the question. This is a specific question for OS X. Chrome has a built-in task manager under "Tools" for looking at the processes within Chrome, which is extremely helpful for locating a stuck page. What is the similar feature in Firefox? I need to identify what is causing it to run slowly; the add-ons that were suggested don't work on OS X or are outdated, and Chrome's about:memory isn't recognizing Firefox. Firefox version 17.0, OS X version 10.8.2 build 12C3006. (Screenshots in the original: how to get to the Task Manager; the Task Manager itself.) This is not a duplicate of: "Task Manager" addon for Firefox?

    Read the article

  • Should I be learning LINQ, direct SQL commands (in .NET), EF, or other?

    - by Wil
    Basically, I have a very good knowledge of plain old SQL coming from Classic ASP programming. Over the past couple of months I have been learning C#, and today was my first full day with MVC 3 (Razor), which I am loving! I need to get back into databases, and I know that writing SqlCommand everywhere is obviously outdated (although it is nice that I still can!). I used to go to a great user group as an IT pro, and the developer stuff went completely over my head, but I do remember a few things that kept coming up, such as LINQ... However, that was some time ago, and now the same people on Twitter are saying how outdated it is. I have tried to research both, and I am clueless as to what direction I should go in, or when to use one over the other (if learning both is a good thing). I am all the more confused because I thought EF was part of the .NET Framework; yet, reading through the quick-start guide, I had to download a component using NuGet. Basically, I am out of my depth here and just need some honest advice on where to go!

    Read the article

  • When NOT to use a framework

    - by Chris
    Today, one can find a framework for just about any language, to suit just about any project. Most modern frameworks are fairly robust (generally speaking), with hour upon hour of testing, peer-reviewed code, and great extensibility. However, I think there is a downside to ANY framework: programmers, as a community, may become so reliant upon their chosen frameworks that they no longer understand the underlying workings, or, in the case of newer programmers, never learn the underlying workings to begin with. It is easy to become specialized to a degree that you are no longer a 'PHP programmer' (for example) but a 'Drupal programmer', to the exclusion of anything else. Who cares, right? We have the framework! We don't need to know how to "do it by hand"! Right?

    The result of this loss of basic skills (sometimes to the extent that programmers who don't use frameworks are viewed as "outdated") is that it becomes common practice to use a framework where it is not required or appropriate. The features the framework facilitates wind up confused with what the base language is capable of. Developers start using frameworks to accomplish even the most basic of tasks, so that what was once considered a rudimentary process now involves large libraries with their own quirks, bugs, and dependencies. What was once accomplished in 20 lines is now accomplished by including a 20,000-line framework AND writing 20 lines to use it.

    Conversely, one does not want to reinvent the wheel. If I'm writing code to accomplish some basic, common little task, I might feel I am wasting my time when I know that framework XYZ offers all the features I am after, and a whole lot more. The "whole lot more" part still has me worried, but it doesn't seem that many even consider it anymore. There has to be a good metric for determining when it is appropriate to use a framework. What do you consider the threshold to be, and how do you decide when to use a framework, and when not to?

    Read the article

  • Rhythmbox in Ubuntu 12.04 and iPod touch

    - by leousa
    I don't know if anyone else is experiencing something similar. I have an iPod touch bought 3.5 years ago; it received its last software update a year ago. After that update I had no problems syncing it in Ubuntu 11.04 and 11.10 (two different laptops). Transferring music albums was flawless with Banshee, and it would even convert files to the right format automatically. Now the same iPod touch, without any further software or firmware update, does not work with Rhythmbox in Ubuntu 12.04. The device is mounted and recognized there, but when you drag and drop an album onto the device, nothing happens. Before you tell me to, yes, I have tried Banshee and gtkpod. The former brings up an error message saying that the iPod does not support MP3 files, and gtkpod simply crashes all the time. The result is the same. What is going on here? Why does a device that worked before not work now in 12.04? I purchased some music in the Ubuntu One store and would love to have it transferred to my iPod. Please, no links to outdated online manuals for older versions of Rhythmbox or Banshee (I've read them all). And again, there is nothing wrong with the iPod, as it worked in 11.04 and 11.10 and has not been updated since then. I would strongly appreciate any help. Thank you.

    Read the article

  • Request Removal of naked domain from Google Index

    - by Pedr
    I have a site which was temporarily available at both example.com and www.example.com. All traffic to example.com is now redirected to www.example.com; however, during the brief period that the site was available at the naked domain, Google indexed it. So Google now has two versions of every page indexed:

    www.example.com
    www.example.com/about_us
    www.example.com/products/something
    ...

    and

    example.com
    example.com/about_us
    example.com/products/something
    ...

    For obvious reasons, this is a bad situation, so how can I best resolve it? Should I request removal of these pages from the index? There is still content at these URLs, but they now redirect to the www subdomain equivalent. The site has many hundreds of pages, but the only way I can see to request removal is via the Remove outdated content screen in Webmaster Tools, one URL at a time. How can I request removal of an entire domain (i.e. the naked domain) without it affecting the true site located at the www subdomain? Is this the correct strategy, given that all the naked-domain URLs now redirect to their www equivalents?
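
    Since the redirects are already in place, one sanity check before filing removal requests is to confirm that every naked-domain URL really answers with a 301 to its www twin (a Python 3 sketch; example.com stands in for the real domain):

        import urllib.error
        import urllib.request

        PATHS = ["/", "/about_us", "/products/something"]

        class NoFollow(urllib.request.HTTPRedirectHandler):
            def redirect_request(self, req, fp, code, msg, headers, newurl):
                return None  # report the redirect instead of following it

        opener = urllib.request.build_opener(NoFollow)
        for path in PATHS:
            try:
                opener.open("http://example.com" + path)
                print(path, "-> no redirect (problem)")
            except urllib.error.HTTPError as e:
                print(path, "->", e.code, e.headers.get("Location"))

    With clean 301s confirmed, Google should converge on the www URLs on its own; the Remove outdated content tool is only needed if the duplicates have to disappear faster than recrawling allows.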

    Read the article

  • I just received a complaint from a user of the website I maintain. Should I do anything?

    - by Chris
    I was sent a large wall of text from a user of the website I maintain at my job. They are clearly upset at having to deal with a horribly outdated web application that has not seen any serious updates in over 6 years: no refactoring has been done, the code quality is terrible, the security is unchecked, policy compliance is ignored, and on top of that it is ugly and frankly embarrassing. Keep in mind this is a small business, but the website is used by hundreds daily. I'm one of two programmers there, and I've been working there for two years. This person says they are about my age (22) and understand technology (but can't use proper grammar). The complaint mentioned awkward pages and actions on the website, but they don't even have a clue as to the depth of the flaws in this website. Now, I would love to honestly tell them that there's a lot wrong with this company and that this application was built when we were in high school, and that while it's not my fault the website is terrible, I'm the one in a position to fix it. But on the other hand, I could just say nothing and ignore it. Would responding publicly have any advantage for future employees (showing integrity), or would it just be a completely pointless mistake? Odds are, even if I respond, only that one person will ever read it. Regardless, I'm probably just going to ignore it and continue starting my project to refactor the website.

    Read the article

  • Install ClamAV from SourceForge

    - by maria
    I'm using Ubuntu 10.04. I'm afraid I may have installed a virus on my computer (I clicked a link that was a virus), and I'd prefer to check that everything is fine. I installed ClamAV using Synaptic. The installed version was 0.96.5, while the most recent version is 0.97.6. When I tried to update the virus database, I got a warning that the version of ClamAV I'm using is outdated. Since I didn't know how to update the software through Synaptic or apt (both show the installed version as the newest), I uninstalled it all and downloaded the recent version from SourceForge. I unpacked the tar.gz archive and entered the folder, but when I type ./configure I get the message: configure: error: Please install zlib and zlib-devel packages When I type sudo aptitude install zlib zlib-devel the terminal output says there are no such packages (there is no package called zlib-devel, and there are many packages whose names contain zlib). I suppose the link probably contained a Windows virus, but I'd prefer to make sure there is nothing on my computer, and that the virus, even without harming my system, cannot send itself to my e-mail contacts.
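
    On Ubuntu, the zlib development headers are packaged as zlib1g-dev; "zlib-devel" is the Red Hat-style name the ClamAV message assumes. A short sketch of the fix (the source directory name is a placeholder):

        import subprocess

        # Install the Debian/Ubuntu zlib headers, then retry configure.
        subprocess.run(["sudo", "apt-get", "install", "-y", "zlib1g-dev"],
                       check=True)
        subprocess.run(["./configure"], cwd="clamav-0.97.6", check=True)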

    Read the article

  • Which universal or driverless printing solution do you use/recommend?

    - by Matt
    I'm in need of a driverless printing solution for Microsoft Terminal Services 2003/2008, mainly to support clients who connect through broadband to our hosted servers. We were hoping that ThinPrint on MSTS 2008 would be the answer, but unfortunately it performs poorly in the print area: the files are too large. I found the following slightly outdated URL: http://www.msterminalservices.org/software/Printing/ It lists a number of products, but I have no experience with any of them. I'd like a product that works and is easy to install (as our clients are remote and not particularly tech-savvy), and ideally one where I pay for a server license rather than one for every client. What is your experience or recommendation, and what tips can you offer me regarding TS printing? Thanks in advance.

    Read the article

  • How to make sure your server NIC performance is at its best on Windows?

    - by Bobb
    I realised that I have been following some obscure paper on configuring NICs on Windows for too long; it may be outdated given the hardware released in the past couple of years and W2008R2. I read a bit about offloading and RSS settings on Windows and realised that it is all very circumstantial: no one can really say "enable that and disable this". So what I really want, for my next server, is to set up a testing environment and measure how my particular application behaves with different settings. The target is primarily TCP latency, and please note I am talking about latency inside the box. Are there precision tools for Windows that can measure latency down to microseconds? P.S. I know this is not an easy question; Windows time drift is an awful problem for any precision test. But I am sure I am not the first person to need this... Please share your experience.
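
    On the measurement side, Python 3's time.perf_counter wraps QueryPerformanceCounter on Windows and resolves well below a microsecond, so a TCP ping-pong gives a usable baseline; a sketch (Python 3.8+; loopback by default, but pointing HOST at a second box puts the NIC and driver in the path):

        import socket, statistics, threading, time

        HOST, PORT, ROUNDS = "127.0.0.1", 9999, 10000

        def echo_server():
            with socket.create_server((HOST, PORT)) as srv:
                conn, _ = srv.accept()
                with conn:
                    while data := conn.recv(64):
                        conn.sendall(data)

        threading.Thread(target=echo_server, daemon=True).start()
        time.sleep(0.2)  # give the listener time to start

        with socket.create_connection((HOST, PORT)) as s:
            s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)  # defeat Nagle
            samples = []
            for _ in range(ROUNDS):
                t0 = time.perf_counter()
                s.sendall(b"x")
                s.recv(64)
                samples.append((time.perf_counter() - t0) * 1e6)  # microseconds

        print(f"median RTT: {statistics.median(samples):.1f} us")
        print(f"p99 RTT: {sorted(samples)[int(ROUNDS * 0.99)]:.1f} us")

    Comparing the median and p99 across offload/RSS settings shows their effect without relying on wall-clock time, which sidesteps the drift problem for relative measurements.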

    Read the article

  • How to make sure clients update their browser cache when my website is updated?

    - by user64204
    I am using the HTTP 1.1 Cache-Control header to implement client-side caching. Since I update my website only once a month, I would like the CSS and JS files to be cached for 30 days with Cache-Control: max-age=2592000. The problem is that the 30-day period defined by Cache-Control doesn't coincide with the website's update cycle: it starts the moment a user visits the site and ends 30 days later, which means an update could occur in the meantime and users would be running with outdated content for a while. That could break the rendering of the website if, for instance, the HTML and CSS no longer match. How can I cache content client-side for periods of several days but somehow get users to refresh their CSS/JS files after the website has been updated? One solution I could think of: since website updates can be scheduled, the max-age returned by the server could be decreased every day accordingly, so that no matter when people visit the website, the end of the caching period coincides with the update. But changing the server configuration every day goes against one of my sysadmin principles (once it's running, don't touch it).
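
    The standard way out of this bind is to version the asset URLs themselves (fingerprinting): the HTML links to style.css?v=<hash of the file>, so the URL, and therefore the cache entry, changes exactly when the content does, and max-age can stay at 30 days. A sketch of generating such links (file paths are placeholders):

        import hashlib
        import pathlib

        # Fingerprinted URL: the query string changes only when the file does,
        # so a long max-age is safe and updates are picked up immediately.
        def asset_url(path):
            digest = hashlib.md5(pathlib.Path(path).read_bytes()).hexdigest()[:8]
            return f"/{path}?v={digest}"

        print(asset_url("css/site.css"))  # e.g. /css/site.css?v=3f2a9c1b
        print(asset_url("js/app.js"))

    Since the HTML itself is served with a short (or no) cache lifetime, users pick up new fingerprints on their next visit and never mix old CSS with new HTML.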

    Read the article

  • How to auto-update a website mirror with exceptions to certain pages?

    - by tomatosalad
    I'm currently mirroring a website on my server. The site itself is rarely updated, but it is updated often enough that its information can become outdated quickly. I mirrored it first with wget, and this worked fine, but I made some changes:

    - The original index.html used frames, but the site also provides a main.html, which is essentially index.html without frames. I deleted index.html and renamed main.html to take its place.
    - I did not want to mirror the webchat, blog, or forum, so I deleted those files and directories, made directories "blogs", "forum", and "chat", and placed a PHP redirect in each, sending visitors to the original site.

    I'd like to auto-update the mirror (maybe once every 24-72 hours) but preserve the changes I made. Is this possible? How would I go about doing it? I am completely clueless as to how. Thanks for any and all help! :)
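
    wget can re-mirror on a schedule while skipping the unwanted sections via --exclude-directories; a sketch of a cron-able update script that re-applies the local changes after each pass (the site name and every path are placeholders):

        import os
        import shutil
        import subprocess

        SITE = "http://example.org"
        DEST = "/var/www/mirror"

        # Re-mirror, skipping the sections replaced by local redirect stubs.
        subprocess.run(
            ["wget", "--mirror", "--convert-links", "--no-parent",
             "--exclude-directories=/blog,/forum,/chat",
             "--directory-prefix", DEST, SITE],
            check=True,
        )

        # Re-apply the local changes: frameless front page and PHP stubs.
        root = f"{DEST}/example.org"  # wget nests output under the host name
        shutil.copy(f"{root}/main.html", f"{root}/index.html")
        for stub in ("blogs", "forum", "chat"):
            os.makedirs(f"{root}/{stub}", exist_ok=True)
            shutil.copy(f"/srv/mirror-overrides/{stub}/index.php",
                        f"{root}/{stub}/index.php")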

    Read the article

  • DVI monitor detected only on computer startup

    - by kamil
    I've recently connected a new monitor, an LG M2252D-PZ, to a rather outdated computer running Windows XP with a Radeon 9600. XP has SP3 installed; the video drivers are the latest version, from back when the video card was still supported. My problem is that the monitor works fine only as long as I don't turn it off or switch it to a different input. When I turn it back on, it says "no signal". The key to the problem must be the DVI port, to which the new monitor is connected. The previous monitor was connected to the VGA output, and I've tested that the new one also works fine when connected to the analogue port. Apparently the computer tests for the presence of a monitor on the DVI port only at startup. The question is, how do I change this?

    Read the article

  • Upgrading Blog to WordPress, keep old one or redirect?

    - by Spazm
    I have had a blog for around 4 years now that has gained some success and gets tons of residual and organic traffic. I am using an outdated version of BlogEngine.NET, and we are going to switch to WordPress. Currently our blog is at ourwebsite.com/blog, and I don't really want to mess with our web server, so I set up a new server just for WordPress; instead of dealing with proxies and things that are above my head, the new blog will be at blog.oursite.com. I am trying to figure out the best way to go about this. We value our search engine rankings very much, as that is where 90% of our traffic comes from. Would I be best off:

    1. Importing all of the old blog posts from oursite.com/blog into blog.oursite.com, then redirecting all of the articles, categories, authors, pages, and tags to the new blog, giving up the direct links to our current site; or
    2. Keeping all of our current articles as they are, adding new content only to the new blog.oursite.com, and just letting the old blog float around our site?

    Read the article

  • Can I compile mutt under Cygwin?

    - by openist
    I've been trying to compile mutt under Cygwin for a few days. The included version is outdated and does not include things I need, like header caching. Anyway, I always get the message "configure: error: no curses library found". I have all the curses + devel packages installed, plus termcap, which I heard might be related. I've tried re-installing, and I've tried specifying the location on the configure command line, but I'm not sure I'm doing it right: "--with-curses=/usr/lib/libncurses.a --with-curses=/usr/lib/libncurses.dll.a --with-curses=/usr/include/ncurses". Here's my config.log: http://floatsolutions.net/docs/config.log Any ideas? EDIT: Context

    Read the article

  • How do I make stunnel verify a client's certificate?

    - by unixman83
    NOTE: The title is misleading; please correct it if you know a better one. What I want to know is how to create the SSL keys/certificates needed for this. I am using stunnel to authenticate RDP (Remote Desktop), and I need to verify that a client possesses the proper credentials so that people cannot brute-force their way into the machine. I am also using a bad (outdated) version of RDP that has security vulnerabilities, so stunnel is a must. I will pre-share the necessary .pem files between machines. What are the openssl commands I need to create the right .pem files on both the client and the server? Which files need to be shared?
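
    For the key material itself, a typical mutual-authentication setup generates a self-signed key+certificate pair on each side and pre-shares only the certificates; a sketch (Python 3 shelling out to openssl; names and the validity period are placeholders):

        import subprocess

        # Self-signed 2048-bit RSA key + certificate, ~10-year validity.
        def make_pair(name):
            subprocess.run(
                ["openssl", "req", "-new", "-x509", "-days", "3650", "-nodes",
                 "-newkey", "rsa:2048",
                 "-keyout", f"{name}-key.pem", "-out", f"{name}-cert.pem",
                 "-subj", f"/CN={name}"],
                check=True,
            )

        make_pair("server")  # run on (or for) the server
        make_pair("client")  # run on (or for) the client

    Only the *-cert.pem files cross the wire: the server lists client-cert.pem as its CAfile and sets verify = 3 in stunnel.conf (verify the peer against a locally installed certificate), the client does the mirror image with server-cert.pem, and the *-key.pem files never leave their machines.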

    Read the article

  • Best setup/workflow for a distributed team to integrate DVCS with a fragmented, huge .NET site?

    - by lazfish
    We have a team of two developers and one manager. The dev server sits in a home office, and the live server sits in a rack somewhere, handled by the larger part of my company. We have the freedom to do as we please, but I want to introduce Kiln (DVCS) and FogBugz for us, with some standard procedures to make sense of our decisions/designs/goals. Our main product is web-based training delivered through our .NET site, with many videos etc., and we also do mobile apps for multiple platforms. Our code base is a 15-year-old fragmented mess. The approach has been rogue .asp/.aspx pages, with some class management introduced over the last 6 years. We still mix our HTML/VB/JS all in the same file when we add a feature or page to the site, and we do not separate the business logic from the rest of the code. Wiring anything up in VS for IntelliSense, testing, or any other benefit is more frustrating than it is worth, because everything has to be manually rejiggered back into one file. How do other teams approach this? I noticed that when I did wire everything up for VS, it wanted to make a class for every function. Do people normally compile DLLs for page-specific functions that won't be reusable? What approaches make sense for getting our practices under control, while still being able to fix old anti-patterns and outdated code and move toward a logical structure for future devs to build on?

    Read the article
