Search Results

Search found 35433 results on 1418 pages for 'document based'.


  • Centralized Windows/Mac Patch Management that is easy to use

    - by BiggsTRC
    I'm looking for advice on which patch management solutions you would recommend based upon your experience, and also which ones you would not recommend. We have a mixed network of Windows and Mac clients. Our central servers are all Windows servers, although I have considered putting in a Mac server to better handle our Mac clients.

    The issue we are currently facing is that we need to maintain the patches on all of our third-party applications. Right now we use WSUS, which handles patching of Windows and some Microsoft products, but that is about it. I need something to cover the other applications, specifically things like Adobe products (Reader, Flash, Dreamweaver, etc.). Our network isn't that big (maybe 200 clients) and I don't have a person to dedicate just to patching and maintaining a patch management solution, so very large and complicated solutions like System Center are most likely out.

    I have recently been looking at Dell's KACE K1000 solution (http://www.kace.com/products/systems-management-appliance/). It seems simple, and it provides a lot of tools in one package that I would like/need as well. I like the fact that it is self-contained in an appliance and that it is designed for environments like mine. However, I'm not sure if this is the best solution. I've also looked a bit at Shavlik's NetChk solution (http://www.shavlik.com/netchk-protect.aspx), but I don't need an anti-virus product. However, it looks like they might have a very good patch database.

    My question is this: What are your thoughts on these two products? Are there better products out there? Are there issues that I'm not considering? I want something that is very good at patching a broad range of products, that is simple to use, that takes a minimal amount of management (like WSUS), and that (hopefully) works with both Mac and Windows.


  • CheckPoint/Amazon VPC VPN tunnel working inconsistently

    - by Lee
    First-time poster, so please be gentle and correct me if there's Server Fault etiquette I'm missing. We have two CheckPoint edge devices at sites A & B, independently managed, connecting to two Amazon private clouds. In both cases, the two Amazon VPCs are in the same community on the CheckPoint device. A VPN tunnel exists between the two CheckPoint devices as well.

    Between sites A & B and the Amazon VPC in Northern Virginia, we are unable to keep more than one tunnel up. Both will come up, but tunnel 2 will drop an hour after initiation and will not come back up while tunnel 1 is up. We believe the 1-hour period is due to IPsec phase 2 renegotiation, but can't be sure. On our side, we see the tunnel 2 remote endpoint as not responding to phase 2 negotiation. Between sites A & B and the Amazon VPC in Oregon, we have no issues: both tunnels are up and fail over properly.

    The CheckPoint gateways are using domain-based VPNs. According to CheckPoint's advice to Amazon, this won't work. Yet, in Oregon, it does. We've pursued this with Amazon and, despite the fact that it's working in Oregon, they've refused to troubleshoot with us further. Can anyone suggest anything we can do to try to get this stabilized? Going to route-based VPNs is not an option for us.


  • Follow cursor location from Kile to Evince

    - by D Connors
    I know the title is probably not very clear, so I'll try to be as clear as possible here. I'm running Xubuntu on my netbook, and I'm using Kile for my LaTeX editing. Since Kile is native to KDE, I had to manually set it to open PDFs and DVIs in Evince instead of Okular.

    Now, the last time I played around with LaTeX I was using TeXnicCenter on Windows, and it had a very neat feature. Whenever I hit "QuickBuild", not only would it open the output .dvi file, but it would also show me exactly the piece of text I was editing. That is, if I were editing line 13 on the 7th page of my document, when I compiled it, the DVI viewer would automatically take me to line 13 on the 7th page of the document, so I wouldn't have to scroll all the way down to it every time I compiled the .tex file.

    I'm guessing this is a pretty standard feature, and Kile probably supports it, but since I don't know what it's called, I'm trying to be clear about what I'm talking about. Problem is, this feature is not working for me right now, and I'm guessing it's either because Evince does not support it, or because I have to manually configure it. Which one is it? And how do I manually configure it, if that's the case?


  • Help transferring Mac OS X fonts to Windows

    - by Addsy
    I have been sent a QuarkXPress document which contains a number of unusual fonts, which were also sent to me. The fonts were in a folder named "fonts", and then everything was zipped up into a single archive containing the document and the fonts folder. I believe this was done on a machine running Mac OS X. I have unzipped the archive, but all of the files in the fonts folder have no size - i.e. they are 0-byte files. However, there is also a __MACOSX folder which has a fonts subfolder. In there I can see all the font files (prefixed with a dot), and they all have a size, i.e. they are not 0 bytes. Presumably these are the actual font files I need, but I cannot figure out how to install them on Windows. I have tried renaming the files to have .ttf and .otf extensions and then installing them, but that doesn't seem to work - I just get an "Invalid font file" message. Any other ideas?


  • Windows 2008 Standard upgrade to Windows 2008 Enterprise failure

    - by Archit Baweja
    Side story: I was in the process of setting up a second Exchange 2010 server for DAG support when I realized that my box needed Windows 2008 Enterprise edition. The box currently has:

    - Windows 2008 Standard
    - Windows updates including SP2
    - Exchange 2010 with CAS, HT, Mailbox roles
    - Domain Services role
    - File Services role

    When I try to upgrade to Windows 2008 Enterprise, I initially got a "your current version of Windows is more recent than the installation media" message, something to that effect. My first guess was that it might be SP2-related, so I uninstalled SP2, restarted and tried again. This time it gave me an error to the effect of "Windows could not configure one or more Windows components. Please restart and try the update again." This was at the last stage of the Windows 2008 Enterprise install, when it says "Completing installation". So I removed the Domain Services role (including demoting it as a DC). However, I get the same error again.

    Has anyone seen something like this before and have any suggestions? Also, is there a log file the Windows upgrade program spits out that I can consult to see what component exactly is interfering?

    Update 1

    Based on some googling I finally found the setup log file, and it seems that Windows Setup had an issue determining whether the .NET 3.0 "feature" was being installed or uninstalled. So, based off a Win7/Vista TechNet article, I'm going to retry the upgrade after removing the .NET 3.0 feature.
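
    For reference, a hedged PowerShell sketch of how one might scan the setup logs for the failing component. The Panther directory is the standard location for Windows Setup logs, but the exact file the post refers to is an assumption here:

      # Look for .NET 3.0 ("NetFx3") and error lines in the Windows Setup logs
      Select-String -Path 'C:\Windows\Panther\setupact.log', 'C:\Windows\Panther\setuperr.log' `
          -Pattern 'NetFx3', 'error' |
          Select-Object -Last 50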


  • Why is Windows Update trying to install an update I don't need?

    - by Oliver Salzburg
    I have a Windows 7 system that currently has a single update pending:

      Windows Internet Explorer 9 for Windows 7 for x64-based Systems

    If I try to install the update, Windows Update will:

    - Create a restore point
    - Fail with the error: Code 9C48 Windows Update encountered an error.

    The event log for the event reads:

      Installation Failure: Windows failed to install the following update with error 0x80070643: Windows Internet Explorer 9 for Windows 7 for x64-based Systems.

    If you search the web for that error, there are many other people with the exact same issue. Sadly, I am unable to apply the proposed solutions to my case, because I just installed this system. There is nothing on it except Windows 7. I installed the system and ran through the updates. I also did the exact same process with this machine several times over the past few days due to a long-term test we just started. I didn't have any problems with any Windows update on the previous installation runs, and I know I didn't do anything different this time because I followed the installation procedure that is to be used during the test. How did this happen and how do I solve it?

    Further Investigation

    So, as I always like to do, I ran the update again while running Process Monitor and dug up further details.

    WindowsUpdate.log

    First of all, there is a Windows Update log file located at C:\Windows\WindowsUpdate.log, which I didn't know about. But I fail to see any significant entry in it; maybe you'll have better luck:

2012-04-10 22:46:58:017 956 728 AU AU received approval from Ux for 1 updates 2012-04-10 22:46:58:017 956 728 AU AU setting pending client directive to 'Progress Ux' 2012-04-10 22:46:58:095 956 728 AU BeginInteractiveInstall invoked for Download 2012-04-10 22:46:58:095 956 728 AU Auto-approving update for download, updateId = {B33ACEC1-3265-4D01-9C37-AC0892E95ED9}.100, ForUx=1, IsOwnerUx=1, HasDeadline=0, IsMinor=0 2012-04-10 22:46:58:095 956 728 AU Auto-approved 1 update(s) for download (for Ux) 2012-04-10 22:46:58:110 956 728 AU UpdateDownloadProperties: 0 download(s) are still in progress.
2012-04-10 22:46:58:110 956 728 AU ############# 2012-04-10 22:46:58:110 956 728 AU ## START ## AU: Download updates 2012-04-10 22:46:58:110 956 728 AU ######### 2012-04-10 22:46:58:110 956 728 AU # Approved updates = 1 2012-04-10 22:46:58:110 956 728 AU AU initiated download, updateId = {B33ACEC1-3265-4D01-9C37-AC0892E95ED9}.100, callId = {35DF928B-B428-4BAC-8C63-55295967EFBB} 2012-04-10 22:46:58:110 956 728 AU Setting AU scheduled install time to 2012-04-11 01:00:00 2012-04-10 22:46:58:110 956 728 AU Successfully wrote event for AU health state:0 2012-04-10 22:46:58:110 956 728 AU Currently showing Progress UX client - so not launching any other client 2012-04-10 22:46:58:110 956 bb8 DnldMgr ************* 2012-04-10 22:46:58:110 956 bb8 DnldMgr ** START ** DnldMgr: Downloading updates [CallerId = AutomaticUpdatesWuApp] 2012-04-10 22:46:58:110 956 bb8 DnldMgr ********* 2012-04-10 22:46:58:110 956 bb8 DnldMgr * Call ID = {35DF928B-B428-4BAC-8C63-55295967EFBB} 2012-04-10 22:46:58:110 956 bb8 DnldMgr * Priority = 3, Interactive = 1, Owner is system = 0, Explicit proxy = 0, Proxy session id = 1, ServiceId = {9482F4B4-E343-43B6-B170-9A65BC822C77} 2012-04-10 22:46:58:110 956 bb8 DnldMgr * Updates to download = 1 2012-04-10 22:46:58:110 956 bb8 Agent * Title = Windows Internet Explorer 9 for Windows 7 for x64-based Systems 2012-04-10 22:46:58:110 956 bb8 Agent * UpdateId = {B33ACEC1-3265-4D01-9C37-AC0892E95ED9}.100 2012-04-10 22:46:58:110 956 bb8 Agent * Bundles 1 updates: 2012-04-10 22:46:58:110 956 bb8 Agent * {6D9A90B7-FAF9-4A47-9EFE-A506264873B3}.100 2012-04-10 22:46:58:110 956 bb8 DnldMgr *********** DnldMgr: New download job [UpdateId = {6D9A90B7-FAF9-4A47-9EFE-A506264873B3}.100] *********** 2012-04-10 22:46:58:110 956 728 AU Successfully wrote event for AU health state:0 2012-04-10 22:46:58:110 956 728 AU # Pending download calls = 1 2012-04-10 22:46:58:110 956 728 AU ## RESUMED ## AU: Download update [UpdateId = {B33ACEC1-3265-4D01-9C37-AC0892E95ED9}, succeeded] 2012-04-10 22:46:58:313 956 bb8 Agent ** END ** Agent: Downloading updates [CallerId = AutomaticUpdatesWuApp] 2012-04-10 22:46:58:313 956 bb8 Agent ************* 2012-04-10 22:46:58:313 956 718 AU ######### 2012-04-10 22:46:58:313 956 718 AU ## END ## AU: Download updates 2012-04-10 22:46:58:313 956 718 AU ############# 2012-04-10 22:46:58:313 956 718 AU Setting AU scheduled install time to 2012-04-11 01:00:00 2012-04-10 22:46:58:313 956 718 AU Successfully wrote event for AU health state:0 2012-04-10 22:46:58:313 956 718 AU Currently showing Progress UX client - so not launching any other client 2012-04-10 22:46:58:313 956 718 AU Successfully wrote event for AU health state:0 2012-04-10 22:46:58:313 956 aac AU Getting featured update notifications. fIncludeDismissed = true 2012-04-10 22:46:58:313 956 aac AU No featured updates available. 
2012-04-10 22:47:00:107 956 aac AU BeginInteractiveInstall invoked for Install 2012-04-10 22:47:00:107 956 aac AU Auto-approving update for install, updateId = {B33ACEC1-3265-4D01-9C37-AC0892E95ED9}.100, ForUx=1, IsOwnerUx=1, HasDeadline=0, IsMinor=0 2012-04-10 22:47:00:107 956 aac AU Auto-approved 1 update(s) for install (for Ux), installType=1 2012-04-10 22:47:00:107 956 aac AU ############# 2012-04-10 22:47:00:107 956 aac AU ## START ## AU: Install updates 2012-04-10 22:47:00:107 956 aac AU ######### 2012-04-10 22:47:00:107 956 aac AU # Initiating manual install 2012-04-10 22:47:00:107 956 aac AU # Approved updates = 1 2012-04-10 22:47:00:107 956 aac AU ## RESUMED ## AU: Installing update [UpdateId = {B33ACEC1-3265-4D01-9C37-AC0892E95ED9}] 2012-04-10 22:47:13:773 2232 9fc Handler : WARNING: Exit code = 0x8024200B 2012-04-10 22:47:13:773 956 718 AU # WARNING: Install failed, error = 0x80070643 / 0x00009C48 2012-04-10 22:47:13:773 2232 9fc Handler ::::::::: 2012-04-10 22:47:13:773 2232 9fc Handler :: END :: Handler: Command Line Install 2012-04-10 22:47:13:773 2232 9fc Handler ::::::::::::: 2012-04-10 22:47:13:851 956 a7c Agent ********* 2012-04-10 22:47:13:851 956 a7c Agent ** END ** Agent: Installing updates [CallerId = AutomaticUpdates] 2012-04-10 22:47:13:851 956 718 AU Install call completed. 2012-04-10 22:47:13:851 956 a7c Agent ************* 2012-04-10 22:47:13:851 956 718 AU # WARNING: Install call completed, reboot required = No, error = 0x00000000 2012-04-10 22:47:13:851 956 718 AU ######### 2012-04-10 22:47:13:851 956 718 AU ## END ## AU: Installing updates [CallId = {FCFF2A5C-25AB-4FB9-AB2B-35C65CCA6A9F}] 2012-04-10 22:47:13:851 956 718 AU ############# 2012-04-10 22:47:13:851 956 718 AU Install complete for all calls, reboot NOT needed 2012-04-10 22:47:13:851 956 718 AU Setting AU scheduled install time to 2012-04-11 01:00:00 2012-04-10 22:47:13:851 956 718 AU Successfully wrote event for AU health state:0 2012-04-10 22:47:13:851 956 498 AU Getting featured update notifications. fIncludeDismissed = true 2012-04-10 22:47:13:851 956 498 AU No featured updates available. 2012-04-10 22:47:14:366 956 168 AU No featured updates notifications to show 2012-04-10 22:47:14:366 956 168 AU UpdateDownloadProperties: 0 download(s) are still in progress. 
2012-04-10 22:47:14:366 956 168 AU Triggering Offline detection (non-interactive) 2012-04-10 22:47:14:366 956 168 AU AU setting pending client directive to 'Install Complete Ux' 2012-04-10 22:47:14:366 956 168 AU Changing existing AU client directive from 'Progress Ux' to 'Install Complete Ux', session id = 0x1 2012-04-10 22:47:14:366 956 168 AU Successfully wrote event for AU health state:0 2012-04-10 22:47:14:366 956 b78 AU ############# 2012-04-10 22:47:14:366 956 b78 AU ## START ## AU: Search for updates 2012-04-10 22:47:14:366 956 b78 AU ######### 2012-04-10 22:47:14:366 956 b78 AU ## RESUMED ## AU: Search for updates [CallId = {0198DD3A-D7B0-48F5-A77D-795F8A1BDCE8}] 2012-04-10 22:47:16:097 956 718 AU # 1 updates detected 2012-04-10 22:47:16:097 956 718 AU ######### 2012-04-10 22:47:16:097 956 718 AU ## END ## AU: Search for updates [CallId = {0198DD3A-D7B0-48F5-A77D-795F8A1BDCE8}] 2012-04-10 22:47:16:097 956 718 AU ############# 2012-04-10 22:47:16:097 956 718 AU No featured updates notifications to show 2012-04-10 22:47:16:097 956 718 AU Setting AU scheduled install time to 2012-04-11 01:00:00 2012-04-10 22:47:16:097 956 718 AU Successfully wrote event for AU health state:0 2012-04-10 22:47:16:097 956 718 AU Successfully wrote event for AU health state:0 2012-04-10 22:47:16:113 956 55c AU Getting featured update notifications. fIncludeDismissed = true 2012-04-10 22:47:16:113 956 55c AU No featured updates available. 2012-04-10 22:47:18:780 956 bb8 Report REPORT EVENT: {27479C66-E930-4F9C-AFF2-27EDD76DED8F} 2012-04-10 22:47:13:773+0200 1 182 101 {B33ACEC1-3265-4D01-9C37-AC0892E95ED9} 100 80070643 AutomaticUpdates Failure Content Install Installation Failure: Windows failed to install the following update with error 0x80070643: Windows Internet Explorer 9 for Windows 7 for x64-based Systems. 2012-04-10 22:47:18:780 956 bb8 Report CWERReporter::HandleEvents - WER report upload completed with status 0x8 2012-04-10 22:47:18:780 956 bb8 Report WER Report sent: 7.5.7601.17514 0x80070643 B33ACEC1-3265-4D01-9C37-AC0892E95ED9 Install 101 Unmanaged 2012-04-10 22:47:18:780 956 bb8 Report CWERReporter finishing event handling. (00000000)

    WU-IE9-Windows7-x64.exe

    The actual update that is executed is downloaded and stored at the following location: C:\Windows\SoftwareDistribution\Download\Install\WU-IE9-Windows7-x64.exe. Executing that file manually results in the following error message:

    IE9_main.log

    The IE9 installer/updater also creates its own log file, located at C:\Windows\IE9_main.log. For the update session in question, the installer logged:

      00:00.000: ====================================================================
      00:00.016: Started: 2012/04/10 (Y/M/D) 23:10:53.897 (local)
      00:00.032: Time Format in this log: MM:ss.mmm (minutes:seconds.milliseconds)
      00:00.063: Command line: "C:\Windows\SoftwareDistribution\Download\Install\WU-IE9-Windows7-x64.exe"
      00:00.078: INFO: Setup installer for Internet Explorer: 9.0.8112.16421
      00:00.094: INFO: Previous version of Internet Explorer: 9.0.8112.16443
      00:00.110: INFO: Checking if iexplore.exe's current version is between 9.0.6001.0...
      00:00.125: INFO: ...and 9.1.0.0...
      00:00.141: INFO: Maximum version on which to run IEAK branding is: 9.1.0.0...
      00:00.156: ERROR: A newer version of Internet Explorer is already installed on the system.
      00:00.188: ERROR: Internet Explorer version check failed.
      01:03.789: INFO: Setup exit code: 0x00009C48 (40008) - A more recent version of Internet Explorer is installed.
      01:03.820: INFO: Scheduling upload to IE SQM server: http://sqm.microsoft.com/sqm/ie/sqmserver.dll
      01:03.852: INFO: SQM Upload returned 403
      01:03.867: INFO: Cleaning up temporary files in: C:\Windows\TEMP\IE978E.tmp
      01:03.883: INFO: Unable to remove directory C:\Windows\TEMP\IE978E.tmp, marking for deletion on reboot.
      01:03.898: INFO: Released Internet Explorer Installer Mutex

    Which pretty much confirms what the error message says when executing the update manually: it's simply already installed, or even obsolete because a newer version is installed. So, why does it keep trying to install the update?

    Possible solutions?

    - Uninstalling Windows Internet Explorer 9 and manually installing the cached C:\Windows\SoftwareDistribution\Download\Install\WU-IE9-Windows7-x64.exe will result in the same error after applying all pending updates.
    - Applying the FixIt for the issue You receive “0x80070643” or “0x643” error codes when you try to install .NET Framework updates through Windows Update or Microsoft Updates will not resolve the issue.
    - Applying the suggested solution for the issue Error message when you try to install updates by using the Windows Update or Microsoft Update Web site: "0x80070003" will not resolve the issue.
    - Running the FixIt Automatically diagnose and fix common problems with Windows Update does report having resolved issues with Windows Update, but didn't resolve the issue.
    - Running the FixIt for the issue How to troubleshoot Windows Update or Microsoft Update when you are repeatedly offered an update does not resolve the issue, neither with normal nor with aggressive settings.
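
    If the goal is simply to stop Windows Update from offering this update again, one workaround (not a root-cause fix) is to hide it. Below is a minimal PowerShell sketch using the Windows Update Agent COM API, run from an elevated prompt; the title filter is an assumption based on the update name above:

      $session  = New-Object -ComObject Microsoft.Update.Session
      $searcher = $session.CreateUpdateSearcher()
      $result   = $searcher.Search("IsInstalled=0 and IsHidden=0")   # pending, not yet hidden
      foreach ($update in $result.Updates) {
          if ($update.Title -like "*Internet Explorer 9*") {         # assumed title match
              "Hiding: " + $update.Title
              $update.IsHidden = $true
          }
      }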


  • How to remove all CouchDB versions in Ubuntu 10.04 (server)? (after multiple installs)

    - by DjangoRocks
    Hi all, I have done multiple installs of CouchDB using:

      sudo aptitude install couchdb
      sudo apt-get install couchdb

    and, more recently, based on the instructions found at http://wiki.apache.org/couchdb/Installing_on_Ubuntu. May I know how I can uninstall or remove all of the above installations? Best regards.

    UPDATE: I've tried running the following commands:

      apt-get remove couchdb
      apt-get purge couchdb

    but received the following errors:

      (Reading database ... 39814 files and directories currently installed.)
      Removing couchdb ...
      invoke-rc.d: initscript couchdb, action "stop" failed.
      dpkg: error processing couchdb (--remove):
       subprocess installed pre-removal script returned error exit status 1
      invoke-rc.d: initscript couchdb, action "start" failed.
      dpkg: error while cleaning up:
       subprocess installed post-installation script returned error exit status 1
      Errors were encountered while processing:
       couchdb
      E: Sub-process /usr/bin/dpkg returned an error code (1)

    May I know how I can fix this? On issuing the command dpkg -l | grep couchdb I received the following response:

      rF couchdb      0.10.0-1ubuntu2  RESTful document oriented database, system D
      iF couchdb-bin  0.10.0-1ubuntu2  RESTful document oriented database, programs

    How do I uninstall CouchDB? I think there's some file corruption?


  • Firebird database corruption causes

    - by Rytis
    I am running several different Firebird versions (2.0, 2.1) on multiple entry-level Windows-based servers with wildly varying hardware. The only thing they have in common is that they are running the same home-built application with the same database structure. Lately I've been seeing massive slowdowns on multiple servers. It turns out that the database gets corrupted, so each time it breaks, I get to mend, back up and restore the database, and then all is fine for some time (1-2 weeks), until it repeats once again. Thankfully, I haven't seen any data loss or damage... yet. The thing is that every such downtime results in lost productivity, and often quite some driving for me, as some of the databases are in remote locations. I've been trying to find out what's causing the corruption, but I haven't been able to. The fact that it's running on different hardware hints that it should not be a hardware-based problem. If we rule out hardware issues, I have a bad feeling that it's a bug in Firebird, as I'm not doing anything fancy via SQL. Do you have any idea how to find out exactly what's causing the corruption and hopefully fix the problem?


  • Search for files after a relative date using Windows search

    - by Zoredache
    I am looking for a way to save a search that includes a relative date. Specifically, I am looking for a way to save a search that matches files with a modification date within the last 7 days. I have read the Windows Search Advanced Query Syntax document and I am not seeing a way to say "7 days ago". The "numbers and ranges" section does mention that relative dates are possible, but the relative dates described there do not fit the criteria I need. The lastweek value almost looks like what I want, except that if I run a query like after:lastweek on a Monday, it will only show files that have been modified since Sunday at 12:00. The lastweek/lastmonth values seem to be relative to the start of the week/month, which is not what I need.

      Multi-word relative dates: week, next month, last week, past month, or coming year. The values can also be entered contracted, as follows: thisweek, nextmonth, lastweek, pastmonth, comingyear.

    One nice thing about saved searches is that they are stored as an XML document and the file format is documented. However, I am not seeing how to form a correct value for a datetime. If I were able to understand this format, I suspect I could use a text editor and create a saved search that does what I want. Fragment from the examples:

      <conditions>
        <condition type="leafCondition" valuetype="System.StructuredQueryType.DateTime" property="System.DateModified" operator="imp" value="R00UUUUUUUUZZXD-30NU" propertyType="wstr" />
      </conditions>

    To summarize, I am looking for an answer to one or both of these questions:

    - How do I make a query for '7 days ago' using the standard syntax?
    - How is the DateTime stored in a saved search?
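
    If the saved-search syntax turns out to be too limited, the same rolling seven-day filter can be scripted instead. A minimal PowerShell sketch (the folder path is a placeholder; LastWriteTime is the modification date):

      # Files modified within the last 7 days, relative to whenever the script runs
      $cutoff = (Get-Date).AddDays(-7)
      Get-ChildItem -Path 'C:\Data' -Recurse |
          Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -ge $cutoff } |
          Select-Object FullName, LastWriteTime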


  • Can't include JavaScript variable in PHP mysql_query call? [on hold]

    - by user198895
    I want the PHP mysql_query call to retrieve user values based on the Agency drop-down value, but I can't get this to work. Am I unable to include the JavaScript variable agency.value in PHP?

      <script type="text/javascript">
      var agency = document.getElementById("agency");
      var user = document.getElementById("user");
      agency.onchange = onchange; // change options when agency is changed
      function onchange() {
          <?php include 'dbConnect.php'; ?>
          <?php $q = mysql_query("select id as UserID, CONCAT(LastName, ', ' , FirstName) as UserName from users where Agency = " . ?>agency.value<?php . " order by UserName");?>
          option_html = "<option value=0 selected>- All Users -</option>";
          <?php while ($row1 = mysql_fetch_array($q)) {?>
          if (agency.value == 0 || agency.value == '<?php echo $row1[AgencyID]; ?>') {
              option_html += "<option value=<?php echo $row1[UserID]; ?>><?php echo $row1[UserName]; ?></option>";
          }
          <?php } ?>
          user.innerHTML = option_html;
      }
      </script>


  • Dropbox failing to update?

    - by Mick
    I have two PCs at home, one running XP and the other Windows 7 64-bit. I have a Word document that I edit on either PC - to enable this, I have the file stored in a Dropbox folder. This usually works fine, but sometimes I find that the file does not get updated on my Windows 7 PC. That is, I edit the file on my Windows XP machine, then go to my Windows 7 PC and see that there is a previous, old date stamp on the file, and sure enough, if I open up the file to have a look, I see that the latest edits are not included. If I right-click on the file and select "Browse on Dropbox website", I see that the latest file is correctly there. Surely there must be some option to say "please update this file", but I can find no such thing. Has something gone wrong? I should point out that my wireless internet connection is a little intermittent - could this have caused some glitch? By the way, I do not leave the old file open in Word on my Windows 7 PC, as I can well imagine that would cause trouble. Also, I should mention that the icon on the document has the little green tick on it, showing that Dropbox is not in the process of doing a transfer. The Dropbox icon in my system tray also has the green tick, so Dropbox is not busy transferring some other file(s). If I hover the mouse over the Dropbox icon I get the tooltip "All files up to date".


  • How to get an ARM CPU clock speed in Linux?

    - by MiKy
    I have an ARM-based embedded machine built around an S3C2416 board. According to the specifications I have available, there should be a 533 MHz ARM9 (ARM926EJ-S according to /proc/cpuinfo), however the software running on it "feels" slow compared to the same software on my Android phone with a 528 MHz ARM CPU. /proc/cpuinfo tells me that BogoMIPS is 266.24. I know that I should not trust BogoMIPS regarding performance ("Bogo" = bogus); however, I would like to get a measurement of the actual CPU speed. On x86, I could use the rdtsc instruction to get the time stamp counter, wait a second (sleep(1)), then read the counter again to get an approximation of the CPU speed, and in my experience this value was close enough to the real CPU speed. How can I find the actual CPU speed of a given ARM processor?

    Update

    I found this simple Pi calculator, which I compiled both for my Android phone and the ARM board. The results are as follows:

    S3C2416:

      # cat /proc/cpuinfo
      Processor : ARM926EJ-S rev 5 (v5l)
      BogoMIPS : 266.24
      Features : swp half fastmult edsp java
      ...
      # ./pi_arm 10000
      Calculation of PI using FFT and AGM, ver. LG1.1.2-MP1.5.2a.memsave
      ...
      8.50 sec. (real time)

    Android:

      # cat /proc/cpuinfo
      Processor : ARMv6-compatible processor rev 2 (v6l)
      BogoMIPS : 527.56
      Features : swp half thumb fastmult edsp java
      # ./pi_android 10000
      Calculation of PI using FFT and AGM, ver. LG1.1.2-MP1.5.2a.memsave
      ...
      5.95 sec. (real time)

    So it seems that the ARM926EJ-S is slower than my Android phone, but not twice as slow, as I would expect from the BogoMIPS figures. I am still unsure about the clock speed of the ARM9 CPU.


  • Resource Monitor (resmon) in Windows Server 2008 R2

    - by Clever Human
    In Windows Server 2008 R2's Resource Monitor, is there a way to set the scale of the various graphs to constant values instead of having them vary based on the data? It seems to me that the utility of a graph is to get a quick overview glance at the values those graphs are showing. So if I look at the CPU graph and the line is up near the top, I can know immediately that something is using all my CPU and go investigate what. I don't really care if the CPU is jumping between 0.01% and 2%. Or if the network usage monitor is up near the top, I will know that all my bandwidth is being used up, and go figure out what. But the way things are now, the graphs are meaningless because the scales constantly shift. If you look at the network usage graph, in one second it might have a scale topping out at 100 Kbps, and the next second have a scale based on 1 Mbps! So... is there a registry key or something that will peg the scale of these graphs to logical maximums? (I mean the graphs on the right-hand side of the screenshot below.)


  • SORT empties my file?

    - by Jonathan Sampson
    I'm attempting to sort a CSV on my machine, but I seem to be erasing the contents each time I use the sort command. I've basically created a copy of my CSV lacking the first row:

      sed '1d' original.csv > newcopy.csv

    To confirm that my new copy exists lacking the first row, I can check with head:

      head 1 newcopy.csv

    Sure enough, it finds my file and shows me the original second row (now the first row). My CSV consists of numerous values separated by commas:

      Jonathan Sampson,,,,[email protected],,,GA,United States,,
      Jane Doe,Mrs,,,[email protected],,,FL,United States,32501,

    As indicated above, some fields are empty. I want to sort based upon the email address field, which is either 4 or 5, depending on whether the sort command uses a zero-based index. So I'm trying the following:

      sort -t, +4 -5 newcopy.csv > newcopy.csv

    I'm using -t, to indicate that my fields are terminated by the comma rather than a space. I'm not sure if +4 -5 actually sorts on the email field or not - I could use some help here. And then > newcopy.csv to overwrite the original file with the new sort results. After I do this, if I try to read in the first line:

      head 1 newcopy.csv

    I get the following error:

      head: cannot open `1' for reading: No such file or directory
      ==> newcopy.csv <==

    Sure enough, if I check my directory, the file is now empty - 0 bytes.


  • Virtual LAN on the Cloud -- Help Confirm My Understanding?

    - by marfarma
    [Note: Tried to post this over at ServerFault, but I don't have enough 'points' for more than one link. Powers that be, move this question over there.]

    Please give this a quick read and let me know if I'm missing something before I start trying to make this work. I'm not a systems admin professional, and I'd hate to end up banging my head into the wall if I can avoid it.

    Goals:

    - Create a 'road-warrior'-capable, star-shaped virtual LAN for consultants who spend the majority of their time on client sites, and whose firm has no physical network or servers.
    - Enable CIFS access to a cloud-server-based installation of Alfresco.
    - Allow eventual implementation of some form of single sign-on (OpenLDAP server) access to Alfresco and other server applications implemented in the future.

    Given:

    - All servers will live in the public internet cloud (Rackspace Cloud Servers).
    - The OpenVPN server will be a Linux distro, probably Ubuntu 9.x, installed on the same server as Alfresco (at least to start).
    - Staff will access server applications and resources from client sites, hotels, trains, planes, coffee shops or their homes, over various ISPs, using their company laptops or personal home desktops.

    Based on my research thus far, to accomplish this I'll need:

    - OpenVPN with bridging enabled, to create a star-shaped "virtual" LAN: http://openvpn.net/index.php/open-source/documentation/miscellaneous/76-ethernet-bridging.html
    - A road-warrior network configuration, as described in this Shorewall article (lower down the page): http://www.shorewall.net/OPENVPN.html
    - Configure bridge addressing (probably DHCP): http://openvpn.net/index.php/open-source/faq.html#bridge-addressing
    - Configure CIFS / Samba to accept the VPN IP address: http://serverfault.com/questions/137933/howto-access-samba-share-over-vpn-tunnel
    - Set up client software, with keys configured for access (potentially through an OpenVPN-AS client portal): http://www.openvpn.net/index.php/access-server/download-openvpn-as/221-installation-overview.html


  • Make Google Chrome's address bar prefer page titles to domain names when offering completions?

    - by Ryan Thompson
    I've recently switched from Firefox to Chrome, and the thing I miss most from Firefox is the "Awesome Bar" that suggests completions for what I type primarily based on page titles, and secondarily based on domain names. Chrome offers both matching URLs and titles, just like Firefox, but Chrome seems to always prefer a matching domain name over a matching page title or a match to another part of the URL (besides the domain), no matter how many times I pass over the former for the latter. In fact, Chrome also prefers to suggest a search rather than matching anything other than a domain name. So is there any hidden preference I can change to tell Chrome that I care more about page titles than domain names?

    Example: I want to go to Google Reader, so I press Control+L and begin typing "reader". The URL for Google Reader is http://www.google.com/reader/view/#overview-page, so the domain name is www.google.com, which does not contain the word "reader". So the first option that Chrome suggests is either another site that has "reader" as part of the domain, or a search for "reader" with the default search engine. No matter how many times I scroll down and select Google Reader, Chrome never "learns" that that's what I want.


  • Laptop authentication/logon via accelerometer tilt, flip, and twist

    - by wonsungi
    Looking for another application/technology: A number of years ago, I read about a novel way to authenticate and log on to a laptop. The user simply had to hold the laptop in the air and execute a simple series of tilts and flips to the laptop. By logging accelerometer data, this creates a unique signature for the user. Even if an attacker watched and repeated the exact same motions, the attacker could not replicate the user's movements closely enough. I am looking for information about this technology again, but I can't find anything. It may have been an actual feature on a laptop, or it may have just been a research project. I think I read about it in a magazine like Wired. Does anyone have more information about authentication via unique accelerometer signatures? Here are the closest articles I have been able to find:

    - Knock-based commands for your Linux laptop
    - Shake Well Before Use: Authentication Based on Accelerometer Data [PDF]
    - Inferring Identity using Accelerometers in Television Remote Controls
    - User Evaluation of Lightweight User Authentication with a Single Tri-Axis Accelerometer
    - Identifying Users of Portable Devices from Gait Pattern with Accelerometers [PDF]
    - 3D Signature Biometrics Using Curvature Moments [PDF]
    - MoViSign: A novel authentication mechanism using mobile virtual signatures


  • Problem with .NET XML ImportNode in PowerShell

    - by Trondh
    Hi, I'm trying to construct a PowerShell script that uses some XML. I have an XML document in which I try to add some values with email addresses. The finished XML document should have this format (I'm only showing the relevant part of the XML here):

      <emailAddresses>
        <value>[email protected]</value>
        <value>[email protected]</value>
        <value>[email protected]</value>
      </emailAddresses>

    So, in PowerShell I try to do this as a test, which fails:

      $newNumber = [xml] '<value>555-1215</value>'
      $newNode = $Request2.ImportNode($newNumber.value, $true)
      $emailnode.AppendChild($newNode)

    After some reading, I have figured out that if I do this, it succeeds:

      $newNumber = [xml] '<value name="flubber">555-1215</value>'
      $newNode = $Request2.ImportNode($newNumber.value, $true)
      $emailnode.AppendChild($newNode)

    So, I am stuck. I'm starting to wonder if I should use another function instead of ImportNode when I have several keys with the same name but different values. As you guys have probably figured out by now, I'm not an expert in XML. Any help appreciated!
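
    For what it's worth, the likely reason the first snippet fails is that PowerShell unwraps an element containing only text into a plain string, so $newNumber.value is not an XmlNode; adding the attribute keeps it an XmlElement, which is why the second snippet works. A minimal sketch that sidesteps ImportNode entirely by creating the node directly in the target document (assuming $Request2 and $emailnode are as in the post):

      $newNode = $Request2.CreateElement('value')   # create the element in the target document
      $newNode.InnerText = '555-1215'               # set its text content
      $emailnode.AppendChild($newNode) | Out-Null   # append; Out-Null suppresses the pipeline output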


  • Backup strategy for developer-focused Apple environments?

    - by ewwhite
    It's interesting to see the technological split between structured corporate environments and more developer-driven/startup environments. Some of the Microsoft technologies I take for granted (VSS, Folder Redirection, etc.) simply are not available when managing the increasing number of Apple laptops I see in DevOps shops. I'm interested in centralized and automated backup strategies for a group of 30-40 Apple laptops... How is this typically done safely and securely, assuming these are company-owned machines (versus BYOD)? While Apple has Time Machine, it's geared toward individual computer backups and doesn't seem to work reliably in a group setting. Another issue with these workstations is the presence of Vagrant/Virtual Box VMs on the developers' systems. Time Machine and virtual machines typically don't work well unless the VMs are excluded from the backup set. I'd like a push-based backup process with some flexible scheduling options. I know how to handle the backend storage, but I'm not sure on what needs to be presented to the client systems. Due to the nature of the data here, cloud-based backup may not be a viable option. Any suggestions about how you handle this in your environment would be appreciated. Edit: The virtual machine backups are no longer important. They can be excluded from the process and planning.


  • PowerShell script to delete subfolders and files if creation date is >7 days, but maintain parent folders of subfolders and files <7 days old

    - by Mark
    I'm currently using the PowerShell script below to delete all files, directories and subdirectories of "$dump_path" that are seven days or older, based upon the creation date and not the modified date. The problem with this script is this: if folder "A" is seven (or more) days old, it will be deleted even if its subfolders and files are less than seven days old.

    What I would like this script to do is this: delete all files from the root and in all subfolders of "$dump_path" that are seven or more days old, but maintain the parent folder(s) of files and folders that are less than seven days old, even if that means the parent folders are more than seven days old. If all of a parent folder's subfolders and files are seven days old or older, then the parent can be deleted.

    Slightly obscure problem, I know, but the intention is to have a 7-day retention period for all data in a 'sandbox' location of our shared areas. Also, it would be an added bonus if it could generate a log of what it deletes and e-mail it out post-deletion. Thank you for reading and I hope that all makes sense! Mark

      # set folder path
      $dump_path = "c:\temp"
      # set minimum age of files and folders
      $max_days = "-7"
      # get the current date
      $curr_date = Get-Date
      # determine how far back we go based on current date
      $del_date = $curr_date.AddDays($max_days)
      # delete the files and folders
      Get-ChildItem $dump_path | Where-Object { $_.CreationTime -lt $del_date } | Remove-Item -Recurse
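
    One way to get the behaviour described above is to delete only the old files first and then prune any folders that end up empty, rather than removing whole folders based on their own creation date. A minimal, untested sketch along those lines (it keeps the CreationTime criterion from the script above; a logging or Send-MailMessage step could be bolted onto the end for the e-mail report):

      $dump_path = "c:\temp"
      $del_date  = (Get-Date).AddDays(-7)

      # 1. Delete only files older than the cut-off; newer files keep their parent folders alive
      Get-ChildItem $dump_path -Recurse |
          Where-Object { -not $_.PSIsContainer -and $_.CreationTime -lt $del_date } |
          Remove-Item -Force

      # 2. Remove folders that are now completely empty, deepest paths first
      Get-ChildItem $dump_path -Recurse |
          Where-Object { $_.PSIsContainer } |
          Sort-Object { $_.FullName.Length } -Descending |
          Where-Object { -not (Get-ChildItem $_.FullName) } |
          Remove-Item -Force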


  • Talk on multiple IRC channels at once?

    - by TwoPixelGrid
    I seem to remember, back in '91 or so, that the console-based ircII implementation on the Solaris box that first got me on the net would let me /join multiple channels on a given network such that, as new channels were joined, they would start scrolling to the single console view. Let's call it the 'interleaved conversation' chat paradigm. Am I remembering this correctly? More importantly, is there a modern way of doing this in any of the GUI-based clients? I'm surprised this isn't a common desire/feature, because I think it would greatly improve the experience, especially on channels with high SNR.

    For example, if I'm working on a project I may connect to Freenode and join #Qt, #OpenGL and #C++. As it is now, with mIRC or XChat, I have to manually flip between pages just to see what's being said and to reply. What I envision would go more like this (using only two channels for simplicity):

      /join #QT #OpenGL
      < [QT] QtChannelUser: Hello TwoPixelGrid.
      < [OpenGL] OpenGLChannelUser: Hi there TwoPixelGrid.
      @QT: Hi QtChannelUser
      @OpenGL: Hello again, OpenGLChannelUser

    And this message is going out to all my channels. Do I have to write a new client, or is this already out there?


  • Postfix selective header_checks: smtpd_relay_restrictions vs. smtpd_recipient_restrictions

    - by luke
    Some of my customers have implemented commercial software that violates email RFCs, such that we have had to relax our header checks. In consequence, we receive more spam.

    Prologue: I know the domains (customer.com) and IP addresses (a.b.c.d/C) these emails come from.

    Kind request for help: Is it possible to set up one Postfix (2.11) instance on Linux such that:

    - it applies only some header checks for emails from .*@customer.com,
    - but applies all header checks for all other email sources?

    I thought of a combination of mynetworks that includes the subnet a.b.c.d/C in smtpd_recipient_restrictions -- allowing all these messages through -- and simultaneously avoiding an open relay with smtpd_relay_restrictions. However, this has not worked out as expected. Any idea or help is highly appreciated. Thanks in advance. Luke

    ==EDIT== For the current issue, I solved the problem by prepending REDIRECTs to header_checks as follows:

      /^received: from.*customer.com.*by mail.own.com.*for.*luke@own.*/ REDIRECT [email protected]

    This works so far as needed. Irrespective thereof, I am still looking for a Postfix configuration that would turn this text-based setting into an IP-address-range-based forwarding rule.... Thanks. Luke


  • Tool to modify properties/metadata of a PDF? i.e. Change "Title", "Author"? Sony Reader showing som

    - by Chris W. Rea
    I own a Sony Reader PRS-600 ebook reader. I bought a ton of Manning Publications ebooks (DRM-free) recently. Many of the books are PDFs since not all the ones I wanted are available in epub format. The problem: Some of the PDF books I purchased have incorrect or missing metadata. Making things worse, the Sony Reader only displays the "Title" from the PDF metadata when displaying book titles in the reader's collection of books! The Reader doesn't display the filename. So, even though I have a PDF informatively named "Windows PowerShell In Action.pdf", it shows up as "untitled" in the Reader. Imagine how useful the Reader's list of book titles becomes when many are just "untitled" or "unnamed document" ! Yes, it is maddening. So – short of expecting the publisher to fix the files or Sony to add a filename-based list instead, I'm looking for a way to fix the PDF metadata. I can view the metadata with Adobe Reader, but it doesn't permit modification of the properties. Leading to: Question: Is there a tool – free, or cheap – and either for PC or Mac, that can modify the properties / metadata of a DRM-free PDF document? I want to correct "Title" and "Author" fields, specifically.


  • How to Programmatically Split Data Using VBA and Specific Logic

    - by Charlene
    This is an addition to my previous post here. The code that was previously supplied to me worked like a charm, but I am having issues modifying it to add some additional logic. I am creating a macro in VBA to do the following: I have raw order data that I need to transform based on some logic.

    Raw data:

      order-id       product-num     date       buyer-name  prod-name  qty-purc  sales-tax  freight  order-st
      0000000000-00  10000000000000  5/29/2014  John Doe    Product 0  1         1.00       1.50     GA
      0000000000-00  10000000000001  5/29/2014  John Doe    Product 1  2         1.00       1.50     GA
      0000000000-00  10000000000002  5/29/2014  John Doe    Product 2  1         1.00       2.00     GA
      0000000000-01  10000000000002  5/30/2014  Jane Doe    Product 2  1         0.00       0.00     PA
      0000000000-01  10000000000003  5/30/2014  Jane Doe    Product 3  1         0.00       0.00     PA

    Desired outcome:

      HDR  0000000000-00  John Doe  5/29/2014
      CHG  Tax      3.00
      CHG  Freight  5.00
      ITM  10000000000000  Product 0  1
      ITM  10000000000001  Product 1  2
      ITM  10000000000002  Product 2  1
      HDR  0000000000-01  Jane Doe  5/30/2014
      ITM  10000000000002  Product 2  1
      ITM  10000000000003  Product 3  1

    The "CHG" rows are created based on the following logic: if the order-st is CA or GA, add up the sales-tax and freight totals across the rows with the same order-id. If the order-st is NOT CA or GA, no CHG rows should be created. Any help would be appreciated - let me know if I left any details out!
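
    Outside of Excel, the same grouping logic can be sketched quickly to make the transformation explicit; a VBA macro would follow the same steps. A rough PowerShell illustration, assuming the raw data has been saved as a CSV with the column headers shown above (file name is a placeholder):

      # Group rows by order-id, emit a HDR line, CHG lines for CA/GA orders, then ITM lines
      Import-Csv 'orders.csv' | Group-Object 'order-id' | ForEach-Object {
          $first = $_.Group[0]
          "HDR`t$($first.'order-id')`t$($first.'buyer-name')`t$($first.date)"
          if (@('CA','GA') -contains $first.'order-st') {
              $tax     = ($_.Group | ForEach-Object { [double]$_.'sales-tax' } | Measure-Object -Sum).Sum
              $freight = ($_.Group | ForEach-Object { [double]$_.'freight' }   | Measure-Object -Sum).Sum
              "CHG`tTax`t"     + ('{0:N2}' -f $tax)
              "CHG`tFreight`t" + ('{0:N2}' -f $freight)
          }
          foreach ($r in $_.Group) {
              "ITM`t$($r.'product-num')`t$($r.'prod-name')`t$($r.'qty-purc')"
          }
      }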


  • Nginx Slower than Apache??

    - by ichilton
    Hi, I've just set up 2x identical Rackspace Cloud instances and am doing some comparisons and benchmarks to compare Apache and Nginx. I'm testing with a 3.4k PNG file, and initially 512MB server instances, but have now moved to 1024MB server instances. I'm very surprised to see that, whatever I try, Apache seems to consistently outperform Nginx... what am I doing wrong?

    Nginx:

      Server Software:        nginx/0.8.54
      Server Port:            80
      Document Length:        3400 bytes
      Concurrency Level:      100
      Time taken for tests:   2.320 seconds
      Complete requests:      1000
      Failed requests:        0
      Write errors:           0
      Total transferred:      3612000 bytes
      HTML transferred:       3400000 bytes
      Requests per second:    431.01 [#/sec] (mean)
      Time per request:       232.014 [ms] (mean)
      Time per request:       2.320 [ms] (mean, across all concurrent requests)
      Transfer rate:          1520.31 [Kbytes/sec] received

      Connection Times (ms)
                    min  mean[+/-sd] median   max
      Connect:        0   11  15.7      3     120
      Processing:     1   35  76.9     20    1674
      Waiting:        1   31  73.0     19    1674
      Total:          1   46  79.1     21    1693

      Percentage of the requests served within a certain time (ms)
        50%     21
        66%     39
        75%     40
        80%     40
        90%     98
        95%    136
        98%    269
        99%    334
       100%   1693 (longest request)

    And Apache:

      Server Software:        Apache/2.2.16
      Server Port:            80
      Document Length:        3400 bytes
      Concurrency Level:      100
      Time taken for tests:   1.346 seconds
      Complete requests:      1000
      Failed requests:        0
      Write errors:           0
      Total transferred:      3647000 bytes
      HTML transferred:       3400000 bytes
      Requests per second:    742.90 [#/sec] (mean)
      Time per request:       134.608 [ms] (mean)
      Time per request:       1.346 [ms] (mean, across all concurrent requests)
      Transfer rate:          2645.85 [Kbytes/sec] received

      Connection Times (ms)
                    min  mean[+/-sd] median   max
      Connect:        0    1   3.7      0      27
      Processing:     0    3   6.2      1      29
      Waiting:        0    2   5.0      1      29
      Total:          1    4   7.0      1      29

      Percentage of the requests served within a certain time (ms)
        50%      1
        66%      1
        75%      1
        80%      1
        90%     17
        95%     19
        98%     26
        99%     27
       100%     29 (longest request)

    I'm currently using worker_processes 4; and worker_connections 1024; but I've tried and benchmarked different values and see the same behaviour with all of them - I just can't get it to perform as well as Apache, and from what I've read previously, I'm shocked by this! Can anyone give any advice? Thanks, Ian

