Search Results

Search found 748 results on 30 pages for 'periodically'.


  • Why doesn't update-manager allow me to upgrade distribution?

    - by spoulson
    I have an Ubuntu 9.04 PC behind a corporate firewall and proxy server. To get update-manager to fetch and apply updates, I must set the proxy and authentication settings in the Synaptic network configuration. Once that is done, I can check for updates and things work smoothly (except that I don't get popup notifications of new updates and must manually check periodically). However, distribution upgrades just don't show up in update-manager, such as the newly released 9.10 Karmic Koala. I had the same issue upgrading 8.10 to 9.04 and solved it by downloading the 9.04 ISO and upgrading from that. What do I need to do to upgrade to 9.10 using the standard update-manager UI?
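    One thing worth checking when the release upgrade is simply never offered: whether /etc/update-manager/release-upgrades is set to offer normal (non-LTS) releases, and whether the upgrader can see the proxy. A minimal sketch, assuming the standard Ubuntu tools and a hypothetical proxy address:

        # Make sure normal (non-LTS) releases are offered
        sudo sed -i 's/^Prompt=.*/Prompt=normal/' /etc/update-manager/release-upgrades

        # Export the proxy so the release upgrader can reach the archive
        # (proxy.example.com:8080 is a placeholder)
        export http_proxy="http://user:pass@proxy.example.com:8080"
        export https_proxy="$http_proxy"

        # Run the distribution upgrade, preserving the proxy environment
        sudo -E do-release-upgrade

    This is the same upgrader the update-manager UI invokes, so if it works here, the UI path usually follows.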

  • Data Execution Prevention problem in Windows Server 2008

    - by naveen
    Hi guys, I am an ASP.NET developer with minimal knowledge of server administration. I have a database hosted on Windows Server 2008. Since this morning it has periodically stopped working. The message given is something to the effect of "The program is being shut down to prevent a Data Execution Prevention error". Some other programs are showing the same behavior. I would like to know what causes this all of a sudden. The server is unprotected (i.e. no anti-virus installed at all); could this be a virus or malware attack? What do we need to do to get SQL Server running smoothly again? Regards, Naveen Jose
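    Given that the box has no anti-virus at all, a sudden wave of DEP terminations is a legitimate reason to run an on-demand malware scan before anything else. If you need breathing room while investigating, the DEP policy can be inspected and relaxed from an elevated command prompt; a hedged sketch (OptOut enforces DEP for everything except programs you explicitly exempt under System Properties > Performance > Data Execution Prevention), not a permanent fix:

        :: Show the current boot entry, including the DEP (nx) policy
        bcdedit /enum {current}

        :: Enforce DEP except for explicitly exempted programs
        bcdedit /set {current} nx OptOut

        :: A reboot is required for the change to take effect
        shutdown /r /t 0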

  • Event-based server game loop in a server-based game

    - by Chris
    I know that this site is full of questions about fixed game loops and variable game loops and different types of threading, but I could find barely anything related to server loops. The server has no screen to draw on. It could just run as fast as possible, but of course this makes no sense. Should it really use single "ticks", send updates periodically after each tick, and wait for the next tick to update its state? Is it feasible to replace the game loop with multiple events, such as incoming network traffic or timers? I have often heard that a game loop should be deterministic, but does it really matter? For instance, when you play a shooter against human players and/or AI, you would probably never be able to repeat the same input twice. Is it a good idea to lose deterministic behavior if it is nearly impossible to reproduce the same input twice? So this question is more or less about whether a strictly event-based game loop is advisable or not, and what the pros and cons are. I could imagine that an event-based loop could perform much faster and smoother, since you don't have big CPU spikes at the beginning of each new tick. The fact that I could not find much about event-based game loops for servers leads me to the conclusion that they are either inefficient or too complicated to get a real benefit from. I'm not sure if this is enough to convey what I'm interested in knowing, but I hope so.
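    For contrast, a minimal sketch (in Python, all names hypothetical) of the fixed-tick server loop the question describes: network events are queued as they arrive and drained once per tick, so the loop stays event-driven at the edges while the simulation itself advances deterministically for a given event ordering.

        import time
        import queue

        TICK_RATE = 20                # simulation steps per second
        TICK_DT = 1.0 / TICK_RATE

        events = queue.Queue()        # network threads would enqueue input here
        world = {"tick": 0, "pending": []}

        def drain_events():
            # Apply every input event that arrived since the previous tick,
            # in arrival order, so a given event sequence replays identically.
            while True:
                try:
                    world["pending"].append(events.get_nowait())
                except queue.Empty:
                    return

        def game_loop():
            next_tick = time.monotonic()
            while True:
                drain_events()
                world["tick"] += 1            # advance the simulation one step
                world["pending"].clear()      # (stand-in for real state updates)
                # ... broadcast state deltas to clients here ...
                next_tick += TICK_DT
                sleep_for = next_tick - time.monotonic()
                if sleep_for > 0:
                    time.sleep(sleep_for)     # idle instead of burning CPU

    The sleep-until-next-tick at the end is what keeps a headless server from spinning at 100% CPU; a purely event-driven design replaces that sleep with blocking on the event queue.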

  • DFS keeps constantly replicating almost all files

    - by Adrian Godong
    We have always had problems with DFS, but recently it has gotten worse (for no apparent reason) to the point where it's becoming harmful. We have one master server and DFS connections to four other servers. The four servers don't modify any files, so all replication always propagates from the master to the other four. The replicated directory has about 900,000 files. In recent weeks, every time we check DFS, the DFS backlogs contain hundreds of thousands of files. For instance, right now the master server is replicating about 700,000 files to three of the four servers, while the fourth one is fine. Sometimes only one is off, sometimes two, and this time three. Also, it is never the same set of servers. It is inconceivable that something periodically touches all 900,000 files. The biggest change that happens is a scheduled update of several thousand files every six hours. Does anybody have the same problem? Is it a known issue?
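    Rather than guessing, it may help to ask DFS-R directly what it thinks changed; anything that rewrites attributes or ACLs across the tree (an anti-virus or backup agent, for example) can re-stage every file. A hedged sketch, with placeholder group, folder, and server names:

        :: Count and list backlogged files between two members
        dfsrdiag backlog /rgname:MyReplicationGroup /rfname:MyFolder ^
            /smem:MASTER /rmem:SERVER2

        :: Check the most recent DFS Replication events on the master
        wevtutil qe "DFS Replication" /c:20 /f:text /rd:true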

  • Redirecting X output

    - by Adam Matan
    Hi, I have a small program that checks some elements of a web service. The program shows graphical output and displays command-line results as well. I have been trying to automate this program to run periodically on a server in my office. The problem is, it only works when I have X enabled - either directly on the server, or via ssh -X. Following Google, I tried Xvfb, which gave me a rather cryptic error message:

        Xvfb :1 -screen 0 1600x1200x32

        Fatal server error:
        Server is already active for display 1
        If this server is no longer running, remove /tmp/.X1-lock and start again.

    Any ideas how to run it? I'm essentially looking for the X equivalent of &>/dev/null... Thanks in advance, Adam
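    The error only means display :1 is already taken; picking a free display number (or letting the xvfb-run wrapper pick one) and pointing DISPLAY at it is usually enough. A sketch, with ./check-webservice standing in for the actual program; note the 24-bit depth, since Xvfb often rejects x32:

        # Start a virtual framebuffer on a display that is not in use
        Xvfb :2 -screen 0 1600x1200x24 &

        # Graphics go to the virtual display; command-line output still
        # reaches stdout and can be logged as usual
        DISPLAY=:2 ./check-webservice >> ~/check-webservice.log 2>&1

        # Or let xvfb-run allocate a free display automatically
        xvfb-run -a ./check-webservice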

  • Windows Server 2008 (SP2) stops responding to network share requests from Windows Vista and 7 clients

    - by Peter LaComb Jr.
    I have two Windows Server 2008 SP2 machines (TFS and TFSBUILD). Periodically, the TFSBUILD server's shares (\\TFSBUILD\ShareName or \\TFSBUILD\C$) become unresponsive to requests from Windows Vista/Server 2008 and Windows 7 clients. Windows XP machines are still able to connect. No events in the server log indicate any problem. A simple restart corrects the issue temporarily, but it always returns. No, it is not this: http://support.microsoft.com/kb/976266 (we aren't using that software). All anti-virus software has been disabled, and the firewall is disabled by policy. No other network activity is affected. Any help would be greatly appreciated.
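    One pattern that fits the symptom (XP clients, which speak SMB1, keep working while Vista/7 clients, which prefer SMB2, hang) is an SMB2 problem on the server. A hedged diagnostic step, not a permanent fix: temporarily disable SMB2 on TFSBUILD and see whether the hangs stop. From an elevated prompt:

        :: Disable SMB2 on the server side (0 = off, 1 = on)
        reg add HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters ^
            /v Smb2 /t REG_DWORD /d 0 /f

        :: Restart the file-serving service so the change takes effect
        net stop server /y
        net start server

    If the problem disappears, that narrows it to the SMB2 stack (and patching is a better long-term answer than leaving SMB2 off).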

  • Swiftfox (Firefox variant) is taking ages to load in Debian Lenny

    - by unixbhaskar
    Swiftfox (a Firefox variant) is taking ages to load on my Debian Lenny system. I have been using Swiftfox on several other Linux distributions without any problem; it works flawlessly on Arch, Fedora, and Gentoo (my other OSes), but it's giving me trouble on Debian Lenny, and I haven't yet found any clue as to why. Please help me shed some light on the problem. I periodically prune the Swiftfox/Firefox internal databases, and I keep the browser up to date with the latest versions available. What should I try to troubleshoot this further? Cheers!
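    If the slowness is profile-bound rather than distro-bound, compacting the profile's SQLite databases with the browser closed is a cheap test. A sketch, assuming the default Firefox profile path; adjust the glob for wherever Swiftfox keeps its profile:

        # With the browser closed, compact every SQLite database in the profile
        for db in ~/.mozilla/firefox/*.default/*.sqlite; do
            sqlite3 "$db" "VACUUM;"
        done

    Starting the browser with a brand-new test profile is the other quick way to tell a profile problem from a system one.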

  • Network / Internet diagnostic tool to locate an error?

    - by Jesper
    Hi, I'm facing some difficulties with the Internet/network at my workplace, and I'm having trouble locating the precise error and when and how it occurs. The problem is that the client machines in the building sporadically get disconnected from the Internet. I'm somewhat new here, so I don't have much insight into the network, and apparently neither did my predecessor. What I'm hoping you know about is whether there exists some kind of network monitoring tool I can install that will periodically check the network, the Internet connection, etc. and record the results to logs. Then, if a problem suddenly arises at some time of day in some part of the network or the Internet connection, I can check it the next day. I've just downloaded and installed the Microsoft Network Monitor 3.3 application, and hopefully it can give me some answers on where the instability is located, but I would still like a tool that makes different checks and tests at some time interval. Does anyone know of such a program, or another kind of performance/diagnostic tool or method I can use? Sincerely, Jesper
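    Short of a full monitoring suite, even a one-minute cron job that timestamps connectivity failures gives a timeline to correlate with user complaints. A minimal sketch, assuming a Unix-like box somewhere on the network; the gateway and external addresses are placeholders:

        #!/bin/sh
        # connectivity-probe.sh - run every minute from cron:
        # * * * * * /usr/local/bin/connectivity-probe.sh
        LOG=/var/log/connectivity.log
        for host in 192.168.1.1 8.8.8.8; do
            if ! ping -c 1 -W 2 "$host" > /dev/null 2>&1; then
                echo "$(date '+%F %T') FAIL $host" >> "$LOG"
            fi
        done

    Probing both the gateway and an outside host splits "LAN problem" from "Internet problem" in the log. Tools such as Nagios or Smokeping do the same thing with graphs and alerting, if a fuller setup is an option.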

  • Server vendor that allows 3rd party disks

    - by Alvin S
    As noted here, Dell is no longer allowing 3rd party disks to be used with their latest servers. As in, they don't work, period. This means that if you buy one of these boxes and want to upgrade the storage later, you have to buy disks from Dell at significant premiums. Dell has just given me a very strong reason to take my server business elsewhere. My company buys (instead of leasing) our servers and typically uses them for 5 years. I need to be able to upgrade and repurpose storage periodically, and I do not want to be locked in to whatever Dell might have in stock, at inflated prices to boot. As you will see in the comments of the above link, it seems HP is doing the same thing. I am looking for a server vendor that offers a 3-5 year warranty with same-day/next-day onsite service and allows me to use 3rd party disks. Suggestions?

  • Git-based storage and publishing, infrastructure advice

    - by Joel Martinez
    I wanted to get some advice on moving a system to "the cloud"... specifically, I'm looking to move into some of Windows Azure's managed services, as right now I'm managing a VM. Basically, the system operates on some data stored in a GitHub git repository. The current architecture (all hosted on a single server):

        - GitHub - configured with a webhook pointing at...
        - an ASP.NET MVC application - which accepts the webhook from git and pushes a message onto...
        - an Azure Service Bus queue - which is drained by...
        - a Windows service - which pulls the message from the queue, fetches the latest data from the git repository (using LibGit2Sharp) onto the local disk, and finally...
        - operates on the data in git to produce a static HTML website hosted/served by IIS.

    The system works really well, actually... but I would like to get out of the business of managing the VM and move to some combination of Azure web and worker roles. Because the system relies so heavily on the git repository on the local filesystem, I'm finding it difficult to figure out how to architect this in the cloud. I know you can get filesystem access, so in theory I could just fetch the repository if there's nothing on disk... but the performance and responsiveness of the system depend on the repository being available and only having to fetch diffs, which is relatively quick - as opposed to periodically having to fetch the entire (somewhat large) git repository whenever the web or worker role is recycled. So I would love some advice on how you would architect such a system. Ultimately, the only real requirement is to serve HTML content produced from the contents of a git repository, in a relatively responsive manner from a publishing perspective. Please feel free to ask any clarifying questions if there's something I omitted. Thanks!
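    Whatever the hosting shape, the worker-role-friendly version of the fetch step is just "clone if the local copy is gone, otherwise fetch". A sketch in shell for clarity (the C# equivalent with LibGit2Sharp has the same shape); the paths and URL are placeholders:

        REPO_DIR=/local/resource/site-data       # role-local scratch disk
        REPO_URL=https://github.com/example/site-data.git

        if [ -d "$REPO_DIR/.git" ]; then
            # Warm start: the instance survived, so fetch only the diffs
            git -C "$REPO_DIR" fetch origin &&
            git -C "$REPO_DIR" reset --hard origin/master
        else
            # Cold start after a recycle: pay the full clone cost once
            git clone "$REPO_URL" "$REPO_DIR"
        fi

    The design consequence is that a recycle costs one full clone and nothing more; if that occasional cost is acceptable, the local-filesystem dependency stops being an architectural blocker.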

  • Deploying new code live

    - by nicoX
    What's the best practice for deploying new code on a live (e-commerce) site? For now, I stop Apache for +/- 10 seconds while renaming the directory public_html_new to public_html and the old one to public_html_old. This creates a short downtime before I start Apache again. The same question applies when using Git to pull the new repo into the live directory: can I pull the repo while the site is active? And what about when I need to copy a DB as well? During tar compression of the live site (for backup purposes) I noticed that changes occurred in the media directory, which indicated to me that files keep changing periodically - and I wonder whether those changes can interfere if Apache is not stopped during deployment.
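    The usual way to avoid the stop/rename/start window entirely is to make the document root a symlink and switch it in one step; Apache serves from the new tree on the next request, with no restart. A sketch with hypothetical paths:

        # Upload the new release alongside the old one
        rsync -a public_html_new/ /var/www/releases/2014-05-01/

        # Repoint the docroot symlink; ln -sfn replaces it in a single
        # rename-like step, so requests see either the old or new tree
        ln -sfn /var/www/releases/2014-05-01 /var/www/public_html

        # The previous release stays on disk for instant rollback
        ls -l /var/www/public_html

    A DB schema change is the one case this doesn't cover; those generally need either backward-compatible migrations or a brief maintenance window.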

  • git svn on multiple machines

    - by stgtscc
    My repo is SVN, and I'm using git-svn to interface with it, which has been working out well. I work on the code base from a few different machines and would appreciate some insight into the best setup going forward. I'd like to use git primarily, but I need to commit to SVN (via git svn dcommit) and pull from SVN (git svn rebase) periodically, from potentially any of the machines. Is it possible to have git svn set up on all of them but somehow push and pull changes between the instances? Or should I set up a bare repo and use that as the central git repo? How would that tie in to git svn? Any insight is appreciated.
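    One common arrangement: keep git-svn on a single "gateway" clone, and share work between machines through a bare repo; the other machines never talk to SVN directly. A hedged sketch, with hostnames and paths as placeholders:

        # On the gateway machine (the only git-svn clone)
        git init --bare ~/repos/project.git     # the shared hub
        git remote add hub ~/repos/project.git
        git push hub master

        # On any other machine
        git clone ssh://gateway/~/repos/project.git project
        # ...work, commit, push back to the hub...

        # On the gateway, to sync with SVN
        git pull hub master
        git svn rebase       # bring in new SVN revisions
        git svn dcommit      # send local commits to SVN
        git push -f hub master   # publish the rewritten history to the hub

    The caveat is that dcommit rewrites the commits it sends, so the hub's master has to strictly mirror the gateway and the other machines must expect occasional forced updates.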

  • NET VIEW occasionally does not show all computers

    - by Neil
    I am having an issue on my workstation, where periodically the NET VIEW command does not seem to work. Most times, NET VIEW will return a list of all the machines connected to our domain. Occasionally, it will only list 4 or 5 (instead of the usual 200+). The only way I've found to remedy it is by rebooting my machine. I am running Windows XP Pro, the Domain Controller is running Server 2003. What could be causing this, and how can I remedy it? Thanks!
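    NET VIEW answers come from the browse list maintained by the Computer Browser service, which is rebuilt by elections and can go stale. When the list collapses, re-registering with WINS and restarting the local Browser service sometimes recovers it without a full reboot; a hedged sketch:

        :: Re-register this machine's NetBIOS names with WINS
        nbtstat -RR

        :: Restart the local Computer Browser service
        net stop "Computer Browser"
        net start "Computer Browser"

        :: Then re-test
        net view /domain:MYDOMAIN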

  • Instructor Insight: Using the Container Database in Oracle Database 12c

    - by Breanne Cooley
    The first time I examined the Oracle Database 12c architecture, I wasn't quite sure what I thought about the Container Database (CDB). In the current release of the Oracle RDBMS, the administrator now has a choice of whether or not to employ a CDB.

    Bundling Databases Inside One Container
    In today's IT industry, consolidation is a common challenge. With potentially hundreds of databases to manage and maintain, an administrator will require a great deal of time and resources to upgrade and patch software. Why not consider deploying a container database to streamline this activity? By "bundling" several databases together inside one container, in the form of pluggable databases, we can save on overhead process resources and CPU time. Furthermore, we can reduce the human effort required for periodically patching and maintaining the software.

    Minimizing Storage
    Most IT professionals understand the concept of storage, as in solid state or non-rotating. Let's take one-to-many databases and "plug" them into ONE designated container database. We can minimize many redundant pieces that would otherwise require separate storage and architecture, as was the case in previous releases of the Oracle RDBMS. The data dictionary can be housed and shared in one CDB, with individual metadata content for each pluggable database. We also won't need as many background processes, thus reducing the overhead cost of the CPU resource.

    Improving Security Levels within Each Pluggable Database
    We can now segregate the CDB-administrator role from that of the pluggable-database administrator, achieving improved security levels within each pluggable database and within the CDB. And if the administrator chooses to use the non-CDB architecture, everything is backwards compatible, too.

    The bottom line: it's a good idea to at least consider using a CDB.

    -Christopher Andrews, Senior Principal Instructor, Oracle University
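    For the flavor of it, creating and opening a pluggable database inside a CDB is essentially a one-statement operation. A minimal sketch in 12c SQL; the names, password, and file paths are hypothetical:

        -- Create a new pluggable database inside the container
        CREATE PLUGGABLE DATABASE sales_pdb
          ADMIN USER pdb_admin IDENTIFIED BY "ChangeMe#1"
          FILE_NAME_CONVERT = ('/u01/oradata/pdbseed/', '/u01/oradata/sales_pdb/');

        -- Open it and switch the session into it
        ALTER PLUGGABLE DATABASE sales_pdb OPEN;
        ALTER SESSION SET CONTAINER = sales_pdb;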

  • Should EC2 servers self-register, or should I have an admin server?

    - by hortitude
    I'm creating AWS servers using Chef, and I'm also planning on enabling auto-scaling. While we have automatic monitoring set up already (Server Density or Nagios, etc.), I was also going to set up the CloudWatch alarm for the status check on the server. This led me down the path of trying to decide whether to install the EC2 command-line tools on the server itself (which then requires me to install Java on the servers - despite no other need for Java in our environment) or to possibly have an "admin" box that will periodically check for servers and make sure they have their alarms set. I expect this paradigm to carry over to other things we want to configure (perhaps ensuring that termination protection is set up on production servers?). Any thoughts as to why to go one direction or the other?
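    For reference, the status-check alarm is a single API call wherever it runs from; the Python-based aws CLI sidesteps the Java dependency of the older tools, which tilts things toward the admin-box (or Chef-driven) approach. A sketch with a hypothetical instance ID and SNS topic:

        # Alarm when the instance fails its EC2 status checks
        aws cloudwatch put-metric-alarm \
            --alarm-name "status-check-i-0123456789abcdef0" \
            --namespace AWS/EC2 \
            --metric-name StatusCheckFailed \
            --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
            --statistic Maximum --period 60 --evaluation-periods 2 \
            --threshold 1 --comparison-operator GreaterThanOrEqualToThreshold \
            --alarm-actions arn:aws:sns:us-east-1:123456789012:ops-alerts

    An admin box that sweeps instances and reconciles alarms also generalizes naturally to checks like termination protection, whereas self-registration logic has to ride along in every AMI.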

  • Windows SBS 2011 DNS Role (service) failing & needing restarting

    - by HaydnWVN
    We have a Windows SBS 2011 server with Exchange that is handling all DNS for the network. Since getting a 3rd party (hardware & support) to set up a receiving FTP service and restrict Exchange's memory usage for another 3rd-party product (stock software), the local network seems to periodically 'lose the internet connection'. Delving deeper, I found that the DNS service is somehow failing/stopping without the actual service on the server reporting it (nothing in the event logs). A simple restart of the DNS role on the server solves the problem. The manager onsite reports that he has to do this most days in the afternoon - yet not at the same time, and on other days it works fine without a restart being required. I lack sufficient SBS 2011 knowledge to diagnose this further; ideally I would like the DNS role to report (and log) the failure, then automatically restart itself.

  • Replace Whole Site by FTP

    - by Sam Machin
    Hi there, I've got a set of tools which periodically (about once a day) generate a complete set of static HTML pages for a site, with the associated folder structure etc. I then need to put those files onto the production server. My problem is that the server runs IIS (6, I think) and I only have regular FTP access. I need a way to automate the process of publishing the new site, and it needs to be a total replacement of the files each time it's published, e.g. delete the whole folder & contents, then put the new ones up. My source server is an Ubuntu machine and I've got total control at that end. I have tried using CurlFtpFS, but it seems to be too slow for what I'm trying to do and locks up. Rgds, Sam
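    lftp's reverse mirror with --delete does exactly this over plain FTP and is generally far faster than a CurlFtpFS mount, since it batches transfers instead of emulating a filesystem. A sketch; credentials, host, and paths are placeholders:

        #!/bin/sh
        # Push the freshly generated site, deleting anything on the remote
        # side that no longer exists locally; --parallel helps with many
        # small files.
        lftp -u "$FTP_USER","$FTP_PASS" ftp://ftp.example.com <<'EOF'
        mirror --reverse --delete --parallel=4 /var/www/generated-site /htdocs
        bye
        EOF

    Dropped into cron after the daily generation step, this makes the publish fully unattended.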

  • Cannot FTP without simultaneous SSH connection?

    - by Lucas
    I'm trying to set up an old box as a backup server (running 10.04.4 LTS). I intend to use 3rd-party software on my PC to periodically connect to my server via FTP(S) and mirror certain files. For some reason, all FTP connection attempts fail UNLESS I'm simultaneously connected via SSH. For example, if I use PuTTY to test the connection to port 21, the system hangs and times out:

        220 Connected to LeServer
        USER lucas
        331 Please specify the password.
        PASS [password]
        <cursor>

    However, when I'm simultaneously logged in (in another session) everything works:

        220 Connected to LeServer
        USER lucas
        331 Please specify the password.
        PASS [password]
        230 Login successful.

    Basically, this means that my software will never be able to connect on its own, as intended. I know that the correct port is open because it works (sometimes), and nmap gives me:

        Starting Nmap 5.00 ( http://nmap.org ) at 2012-03-20 16:15 CDT
        Interesting ports on xx.xxx.xx.x:
        Not shown: 995 closed ports
        PORT    STATE SERVICE
        21/tcp  open  ftp
        22/tcp  open  ssh
        53/tcp  open  domain
        139/tcp open  netbios-ssn
        445/tcp open  microsoft-ds
        Nmap done: 1 IP address (1 host up) scanned in 0.15 seconds

    My only hypothesis is that this has something to do with iptables - maybe it's allowing only established connections? I don't think that's how I set it up, but maybe? Here are my iptables rules for INPUT:

        lucas@rearden:~$ sudo iptables -L INPUT
        Chain INPUT (policy DROP)
        target                    prot opt source    destination
        fail2ban-ssh              tcp  --  anywhere  anywhere    multiport dports ssh
        ufw-before-logging-input  all  --  anywhere  anywhere
        ufw-before-input          all  --  anywhere  anywhere
        ufw-after-input           all  --  anywhere  anywhere
        ufw-after-logging-input   all  --  anywhere  anywhere
        ufw-reject-input          all  --  anywhere  anywhere
        ufw-track-input           all  --  anywhere  anywhere
        ACCEPT                    tcp  --  anywhere  anywhere    tcp dpt:ftp

    I'm using vsftpd. Any thoughts/resources on how I could fix this? L
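    One common culprit for FTP-through-iptables trouble with a DROP policy is the missing FTP connection-tracking helper: the control connection on 21 matches your ACCEPT rule, but related data connections never get classified as RELATED and stall the session. A hedged sketch of the usual fix on Ubuntu with ufw:

        # Load the FTP connection-tracking helper now...
        sudo modprobe nf_conntrack_ftp

        # ...and on every boot
        echo nf_conntrack_ftp | sudo tee -a /etc/modules

        # With ufw, the equivalent is adding nf_conntrack_ftp to
        # IPT_MODULES in /etc/default/ufw, then:
        sudo ufw reload

    If the hang persists even at the PASS stage, it is also worth testing with vsftpd's reverse-DNS and ident lookups in mind, since login-time stalls are often lookup timeouts rather than packet filtering.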

  • How can I selectively increase latency? E.g. throttle games

    - by Arcymag
    Basically, I want networked games to run poorly on a network, but I want everything else to run smoothly. I would also appreciate advice on blocking games in general. As far as I can tell, there are a few ways to completely prevent an internet game from running:

        - Blocking it entirely via DNS configuration (e.g. the hosts file, or router DNS configuration)
        - Blocking it entirely via a separate DNS server
        - Blocking the application, by uninstalling it or through some kind of access control
        - Blocking the application by automatically killing the process every once in a while
        - Blocking the application by corrupting its files periodically

    However, I would like a more subtle way to block a program - something that either:

        - Increases latency (would this be doable through some kind of QoS, like what DD-WRT offers? see the sketch below)
        - Increases latency by using a special routing configuration for specific target IPs
        - Throttles other system resources, such as memory, IO, or CPU
        - Screws around with keyboard configurations when a game is launched

    I would like this to work on Mac OS X and Windows, but Linux would be great too. FYI, I don't have a kid - I was brainstorming with some friends and parents.
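    On Linux (including DD-WRT-class routers where tc is available), netem does precisely the selective-latency part: classify the game's traffic and add delay only to that class, leaving everything else untouched. A hedged sketch that delays traffic to one hypothetical game-server address:

        # Priority qdisc with three bands as the root
        tc qdisc add dev eth0 root handle 1: prio

        # Attach 250ms +/- 50ms of delay to band 3 only
        tc qdisc add dev eth0 parent 1:3 handle 30: netem delay 250ms 50ms

        # Steer packets destined for the game server into the delayed band
        tc filter add dev eth0 parent 1: protocol ip prio 1 \
            u32 match ip dst 203.0.113.42/32 flowid 1:3

    Matching on destination port ranges instead of a single IP works the same way, which is handy when a game uses many servers.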

  • UNRAID V4.7: Lost write permission on Win7/Android devices

    - by JROC
    I'm currently running V4.7, and I haven't touched any of the user or share settings, but I'm periodically losing read/write permission on both my Windows 7 PC and my Android tablet connecting over the wireless. Sometimes I can access my shares and see the folder directories, but when attempting to open a folder, Windows denies me access, saying I don't have the proper permission. This is after I have logged in with my main account, which has full read/write access to everything; the same happens on my Android device. This all started when I attempted to delete a large number of files (8 GB) to make more room, and about halfway through I started getting permission errors. What could be causing this? Thanks

  • Case Study: Polystar Improves Telecom Networks Performance with Embedded MySQL

    - by Bertrand Matthelié
    Polystar delivers and supports systems that increase the quality, revenue and customer satisfaction of telecommunication services. Headquartered in Sweden, Polystar helps operators worldwide, including Telia, Tele2, Telekom Malaysia and T-Mobile, to monitor their network performance and improve service levels.

    Challenges
    - Deliver complete turnkey solutions to customers, integrating a database that ensures high performance at scale while being very easy to use, manage and optimize.
    - Enable the implementation of distributed architectures, including one database per server, while maintaining a low Total Cost of Ownership (TCO).
    - Avoid growing database complexity as the volume of mobile data to monitor and analyze drastically increases.

    Solution
    - Evaluation of several databases and selection of MySQL based on its high performance, manageability, and low TCO.
    - The MySQL databases implemented within the Polystar solutions handle on average 3,000 to 5,000 transactions per second. Up to 50 million records are inserted every day in each database.
    - Typical installations include between 50 and 100 MySQL databases, up to 300 for the largest ones.
    - Data is then periodically aggregated, with the original records being overwritten, as detailed information becomes unnecessary to operators after a few weeks.

    The exponential growth in mobile data traffic, driven by the proliferation of smartphones and the usage of social media, requires ever more powerful solutions to monitor, analyze and turn network data into actionable business intelligence. With MySQL, Polystar can deliver powerful, yet easy to manage, solutions to its customers. MySQL-based Polystar solutions enable operators to monitor, manage and improve the service levels of their telecom networks in over a dozen countries from a single location. The new and innovative MySQL features constantly delivered by Oracle help ensure Polystar that it will be able to meet its customers' needs as they evolve.

    "MySQL has been a great embedded database choice for us. It delivers the high performance we need while remaining very easy to use, manage and tune. Power and simplicity at its best." - Mats Söderlindh, COO at Polystar.

  • How can dev teams prevent slow performance in consumer apps?

    - by Crashworks
    When I previously asked what's responsible for slow software, a few answers I received suggested it was a social and management problem:

        "This isn't a technical problem, it's a marketing and management problem.... Ultimately, the product managers are responsible for writing the specs for what the user is supposed to get. Lots of things can go wrong: The product manager fails to put button response in the spec... The QA folks do a mediocre job of testing against the spec... if the product management and QA staff are all asleep at the wheel, we programmers can't make up for that." - Bob Murphy

        "People work on good-size apps. As they work, performance problems creep in, just like bugs. The difference is - bugs are 'bad' - they cry out 'find me, and fix me'. Performance problems just sit there and get worse. Programmers often think 'Well, my code wouldn't have a performance problem. Rather, management needs to buy me a newer/bigger/faster machine.' The fact is, if developers periodically just hunt for performance problems (which is actually very easy) they could simply clean them out." - Mike Dunlavey

    So, if this is a social problem, what social mechanisms can an organization put into place to avoid shipping slow software to its customers?

  • Is there any way to use the xscreensaver gltext to monitor system information?

    - by Shane
    I am trying to get the xscreensaver gltext module to monitor my system temperature and maybe some other stats. Writing a script that assembles the stats periodically is no problem; the main thing I'm running into is that gltext doesn't refresh - whatever text I feed it stays there. So, for example, I run this command:

        $ /usr/lib/xscreensaver/gltext -text "`cat /proc/acpi/thermal_zone/THRM/temperature`"

    and get a gltext window showing:

        temperature: 60 C

    I can manipulate and format it as necessary, but as my CPU heats up and cools down, the display doesn't update, even if I include time variables that do. I have a script that feeds gltext the time as the first line and the temperature as the second line; although the time updates continuously, the temperature value remains whatever it was when the screensaver started. Is it possible to do what I want: change the text every 60 seconds as the temperature changes?
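    Worth noting: the backticks in the -text example are expanded once by the shell, before gltext ever starts, so that string can never change afterward. gltext's -program option (check man gltext on your build for the exact behavior) runs a command itself and re-runs it periodically, which is the refresh mechanism you're after. A hedged sketch:

        # Let gltext run (and re-run) the command itself, instead of having
        # the shell expand the backticks a single time up front
        /usr/lib/xscreensaver/gltext -program \
            'cat /proc/acpi/thermal_zone/THRM/temperature'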
