Search Results

Search found 26214 results on 1049 pages for 'farm solution'.


  • Cannot Ping PS3 Using a Wired Connection

    - by Post
    The PS3 is using its built-in wireless network adapter (I cannot change this). When I try to ping the PS3 from ANY computer on a wired ethernet connection, I get Request Timed Out errors. Whenever I ping from a computer with a wireless connection, it works just fine. To be clear:
    Pinging from wireless PC to wireless PS3 works
    Pinging from wired PC to wireless PS3 fails
    I have tried this on several PCs and laptops, all with the same results. As an attempted solution I have set up static IPs on all related devices. More information:
    Default Gateway = 192.168.2.1
    PS3 (wireless) = 192.168.2.100
    PC (wired) = 192.168.2.99
    Subnet Mask (for both devices, I have made sure) = 255.255.255.0
    Thanks
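
    For illustration, a minimal set of checks from the wired PC (a sketch in Linux/macOS syntax; on Windows use ping -n instead of ping -c). The idea is to see whether the PS3 ever answers ARP on the wired segment, since some routers isolate wireless clients from the wired LAN:

        # Gateway should always answer from the wired side
        ping -c 4 192.168.2.1
        # The wireless PS3 -- this is the one that times out
        ping -c 4 192.168.2.100
        # If this entry stays incomplete after the pings, the router is not bridging
        # wired and wireless clients, and the PS3 itself is probably not at fault
        arp -a | grep 192.168.2.100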

    Read the article

  • Backup/Multihomed network connection

    - by J_P
    We have a couple of locations that require 24/7 access to the Internet, and our current provider (AT&T), while mostly good, is not always up. My concern would be that if I go with another provider (for example Comcast), I'm going to be subject to the same downtime if the failure is in the "last mile". I for the most part don't know where the failure points are on the ISP side, but I would imagine the large majority are within the last mile. I've looked at MiFi or a similar solution but have concerns about bandwidth caps and overall speed. Any suggestions would be appreciated.
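
    For what it's worth, once a second uplink exists the failover itself can be a small watchdog that swaps the default route when the primary gateway stops answering. A rough sketch only -- the interface names and gateway addresses are placeholders, and a real dual-WAN router or policy routing would be more robust:

        #!/bin/bash
        # Crude WAN failover: prefer the primary uplink, fall back to the backup
        # while pings through the primary fail.
        PRIMARY_GW=203.0.113.1    # placeholder: AT&T gateway on eth0
        BACKUP_GW=198.51.100.1    # placeholder: second ISP gateway on eth1
        while true; do
            if ping -c 3 -W 2 -I eth0 "$PRIMARY_GW" > /dev/null 2>&1; then
                ip route replace default via "$PRIMARY_GW" dev eth0
            else
                ip route replace default via "$BACKUP_GW" dev eth1
            fi
            sleep 10
        done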

    Read the article

  • Controlling access to my API using SSH public key (not SSL)

    - by tharrison
    I have the challenge of implementing an API to be consumed by relatively non-technical clients -- pasting some sample code into their WordPress or homegrown PHP site is probably as much as we can ask. Asking them to install SSL on their servers ain't happening. So I am seeking a simple yet secure way to authenticate API clients. OAuth is the obvious solution, but I don't think it passes the "simple" test. Adding a client id and hashed secret as a parameter to the requests is closer -- it's not hard to do md5($secret . $client_id) or whatever the PHP would be. It seems to me that if client requests could use the same approach as SSH public keys (the client gives us a key from their server(s)), there should be some existing magic to make all of the subsequent transactions transparently work just as regular HTTP API requests. I am still working this out (obviously :-), so if I am being an idiot, it would be nice to know why. Thanks!
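
    Not the poster's design, just an illustration: the closest simple analogue is usually a signed-request (HMAC) scheme, where each client holds a shared secret and only a signature over the request ever travels on the wire. A sketch with made-up endpoint, header, and parameter names:

        # Hypothetical signed API request: the client id is sent in the clear,
        # the secret never is -- only an HMAC computed over the request body.
        CLIENT_ID="acme-wordpress"
        SECRET="not-a-real-secret"
        BODY='{"action":"list_widgets"}'
        SIG=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$SECRET" | awk '{print $NF}')
        curl -s "https://api.example.com/v1/widgets" \
             -H "X-Client-Id: $CLIENT_ID" \
             -H "X-Signature: $SIG" \
             -d "$BODY"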

    Read the article

  • Is there software that can visualize all sounds from the sound card?

    - by bentsai
    I'm looking for a solution to this problem: when I'm working at my computer, sometimes I'll be using noise-isolation headphones connected to a source other than my computer (e.g., an iPod). I would like to see (this is what I mean by "visualize") some kind of notification on my screen, since I will not be able to hear the sound. Is there any software out there that would accomplish this? I'm interested in seeing any sounds that would normally come out of the sound card. This is for Windows (XP), but I'd be interested in hearing solutions for other flavors, and OS X, as well.

    Read the article

  • Using Redirect for an SSL CDN to have a Custom Domain

    - by bendytree
    We're looking to move our website/app assets to a CDN. The problem is, our CDN doesn't offer custom domain names with SSL. In other words, for SSL they offer https://1234.cdn.hostingcompany.com but not https://assets.mysite.com. This seems like a huge problem, since I don't want to re-publish my app with their domain hard-coded. So I read somewhere about a method where you send people to https://assets.mysite.com and then redirect them to https://1234.cdn.hostingcompany.com. Is there merit to that solution, or would that completely defeat the purpose of the CDN?

    Read the article

  • Crackling sound from right laptop speaker

    - by user1880405
    This problem has lasted for several months already (it first appeared on Ubuntu 13.10, not on 12.04). I get a very loud crackling/popping sound from the right speaker of my Asus K56C; I have searched everywhere but could not get rid of it. Several facts:
    There is no problem on Windows 8.
    It has nothing to do with applications running, because it appears even before the Ubuntu login screen. The same problem occurs if I boot from a live USB.
    Muting the sound removes the noise, but lowering the volume has no effect.
    Plugging in any headphones removes the noise.
    If I disconnect the power cable while there is noise, the noise always disappears, but only if there is no music playing. If I start playing music, the noise appears again even with the power cable disconnected.
    Sometimes the noise disappears for 1-4 weeks, and then reappears for no reason and lasts from several days to weeks.
    The noise is always the same. I tried adding tsched=0 to /etc/pulse/default.pa, and also tried the PositionReporting fix, with no effect. I also tried disconnecting all the cables and removing all electronic devices around the laptop, but it had no effect. I also tried removing PulseAudio, which didn't change anything. It would be great if someone had a real solution for this problem.
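
    One more thing that is sometimes suggested for intermittent crackle on Intel HDA hardware is the codec power-saving feature, which powers the codec down when idle. This is only a hedged guess for this particular laptop, but it is cheap to test (assuming the snd_hda_intel driver is in use):

        # Check whether HDA power saving is enabled (non-zero = codec powers down when idle)
        cat /sys/module/snd_hda_intel/parameters/power_save
        # Disable it temporarily and listen for the popping
        echo 0 | sudo tee /sys/module/snd_hda_intel/parameters/power_save
        # If that helps, make it permanent with a modprobe option
        echo "options snd-hda-intel power_save=0" | sudo tee /etc/modprobe.d/disable-audio-powersave.conf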

    Read the article

  • How do I get my Intel HD graphics to work alongside my HD7850, as my second (HDMI out) monitor?

    - by AlexTes
    Title says it all. Further info:
    Motherboard: http://www.asrock.com/mb/Intel/Z77%20Pro3/
    Processor: http://ark.intel.com/products/65520/Intel-Core-i5-3570K-Processor-%286M-Cache-up-to-3_80-GHz%29
    Currently my main screen is running on my HD7850, with drivers from the AMD website. I have looked through dozens of questions here. I'm about to try booting Ubuntu from a stick and seeing if the xorg-edgers drivers might help. When booting, all action goes down on the very screen I'm trying to get to work. EDIT: never mind this, it seems to be special boot magic, as that screen only displays white-line errors once the GUI of Ubuntu has kicked in and everything graphical is happening through my graphics card again. The second monitor is connected through HDMI (motherboard) to DVI. So unless having multiple displays is a huge deal, the solution hopefully isn't that complicated; I just feel I'm missing something simple. If this really is complicated, I should probably just hook up the display to my graphics card. My CPU is usually the one chilling out, though, so I'd like to try to get that to work. Also, I don't want to buy an extra cable, and this setup makes me feel warm and fuzzy inside. Tell me what to try or look up, I'll be most appreciative. Thank you!
    UPDATE: The x-swat PPA installed some Intel stuff. Booting with one monitor plugged into the motherboard gives nothing. Doing it with the PC already on gives the purple "Ubuntu" boot/shutdown screen with 5 dots.
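
    As a hedged starting point only: it may help to check whether X even exposes both GPUs as providers, since without that the motherboard output cannot be driven alongside the discrete card. The provider names below are placeholders, and driver support varies (the proprietary AMD driver of that era often did not support this):

        # List the providers X knows about; ideally both the Intel IGP and the
        # HD7850 show up here.
        xrandr --listproviders
        # If both are listed, the Intel outputs can source their content from the
        # other GPU, roughly:
        xrandr --setprovideroutputsource <intel_provider> <amd_provider>
        xrandr --auto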

    Read the article

  • How to secure a VM while allowing customer RDS (or equivalent) access to its desktop

    - by ChrisA
    We have a Windows Client/(SQL-)Server application which is normally installed at the customer's premises. We now need to provide a hosted solution, and browser-based isn't feasible in the short term. We're considering hosting the database ourselves, and also hosting the client in a VM. We can set all this up easily enough, so we need to: ensure that the customer can connect easily, and also ensure that we suitably restrict access to the VM (and its host, of course). We already access the host and guest machines across the internet via RDS, but we restrict access to only our own internal, very small, set of static IPs, and of course there's the 2 (or 3?) user limit on RDS connections to a remote server. So I'd greatly appreciate ideas on how to manage: the security, and the multi-user aspect. We're hoping to be able to do this initially without a large investment in virtualisation infrastructure -- it would be one customer only to start with, with perhaps two remote users. Thanks!

    Read the article

  • Can I share data between two users in an ASP.NET Application?

    - by Dave
    I have an issue with the Roles.IsUserInRole function. It takes a huge amount of time just to check if the logged-in user is in a particular role (typically 3-9 seconds). I searched for a solution and arrived at this, but I am not sure if I have fully grasped it. What I got from the above: a new derived class is created, and inside that class there is a list which retrieves all users at once. The next time you check IsUserInRole, you do not use the actual IsUserInRole method but rather the one you overrode in your class. Is this the correct description? Am I on track? My question is, can data be shared between two different users in an ASP.NET application? If yes, will the shared data exist only while there is at least one user logged in, and if all users log out, is that shared data destroyed? My point is that this data would be created only once, when a user logs in. Can all subsequent users then use this data and check their roles against the list? I need a detailed answer. My application has users and different roles. We are using ASP.NET roles.

    Read the article

  • WiFi hotspot not created in Ubuntu 12.04

    - by user2406568
    I am using an HP Pavilion g4 with a Broadcom wireless adapter. I have the following hardware configuration for LAN and WiFi:
    eth0   Link encap:Ethernet  HWaddr 10:1f:74:b2:61:cc
           inet addr:10.3.10.45  Bcast:10.3.11.255  Mask:255.255.252.0
           inet6 addr: fe80::121f:74ff:feb2:61cc/64 Scope:Link
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
           RX packets:190855 errors:0 dropped:0 overruns:0 frame:0
           TX packets:133209 errors:0 dropped:0 overruns:0 carrier:0
           collisions:0 txqueuelen:1000
           RX bytes:77642990 (77.6 MB)  TX bytes:25290447 (25.2 MB)
           Interrupt:44 Base address:0x6000
    eth1   Link encap:Ethernet  HWaddr 38:59:f9:7d:d6:b2
           inet addr:10.3.9.180  Bcast:10.3.11.255  Mask:255.255.252.0
           inet6 addr: fe80::3a59:f9ff:fe7d:d6b2/64 Scope:Link
           UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
           RX packets:245562 errors:82 dropped:0 overruns:0 frame:408011
           TX packets:90383 errors:260 dropped:0 overruns:0 carrier:0
           collisions:0 txqueuelen:1000
           RX bytes:140772881 (140.7 MB)  TX bytes:13041542 (13.0 MB)
           Interrupt:16
    lo     Link encap:Local Loopback
           inet addr:127.0.0.1  Mask:255.0.0.0
           inet6 addr: ::1/128 Scope:Host
           UP LOOPBACK RUNNING  MTU:16436  Metric:1
           RX packets:3230 errors:0 dropped:0 overruns:0 frame:0
           TX packets:3230 errors:0 dropped:0 overruns:0 carrier:0
           collisions:0 txqueuelen:0
           RX bytes:435198 (435.1 KB)  TX bytes:435198 (435.1 KB)
    Whenever I click on "create hotspot", nothing happens. Any solution?
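
    A hedged check worth adding to the question: whether the wireless driver supports AP (master) mode at all, since the proprietary Broadcom "wl" driver (which typically names the wireless interface eth1, as above) generally cannot create an access point. For example:

        # Which driver is bound to the wireless chip?
        lspci -nnk | grep -A 3 -i network
        # Hotspot/AP mode only works if "AP" appears among the supported interface
        # modes; with the wl driver this list is usually missing or empty.
        iw list | grep -A 10 "Supported interface modes"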

    Read the article

  • Installing multiple versions of a shared library

    - by nsfyn55
    I am running Ubuntu 10.04 and I want to use tmux 1.6, which has a dependency on libevent 2. My solution was to compile libevent 2 and drop it into /usr/local/lib, then compile tmux against this lib and drop it into /usr/local/bin. This works great until... I restart. This is just an assumption on my part, but it seems that other binaries are now linking to the libevent 2 library, presumably because it's on the library path. Because there are 60+ packages with libevent 1 dependencies, this causes my install to basically lose its mind. Is there an idiomatic way to approach running an application that has a core library dependency on a different version? Should I just statically link the lib?
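
    One common way to handle this (a sketch, not tested against this exact setup; the /opt/libevent2 prefix is arbitrary) is to install libevent 2 into its own prefix, outside the default linker path, and bake that location into the tmux binary with an rpath so nothing else ever sees the newer library:

        # Build libevent 2 into a private prefix
        ./configure --prefix=/opt/libevent2 && make && sudo make install

        # Build tmux against it, embedding the library location in the binary (rpath),
        # so tmux finds libevent 2 at runtime without /opt/libevent2/lib ever being
        # added to the system-wide library path.
        CFLAGS="-I/opt/libevent2/include" \
        LDFLAGS="-L/opt/libevent2/lib -Wl,-rpath,/opt/libevent2/lib" \
        ./configure --prefix=/usr/local && make && sudo make install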

    Read the article

  • Apache with mod_perl eating memory when idle

    - by syneticon-dj
    An Apache webserver running a mod_perl application is showing abnormal memory usage: after the "day load" ceases, the system's memory is exhausted by the Apache processes and oom_killer is invoked. As the load returns the following morning, memory usage normalizes -- probably because Apache workers get recycled periodically if a sufficient number of hits is generated. The graph of Apache hits per second correlates with this. The remaining 2 hits per second throughout the night are induced by HAProxy checks -- it runs HEAD http://mydomain.example.com/running HTTP/1.0 requests against the server every half second, with "running" being a static file (i.e. not invoking any Perl code). It also seems that disabling these checks remedies the memory usage problem, but that obviously cannot be the solution. All 3 similarly configured servers (behind HAProxy) show this behavior. The OS is Ubuntu 10.10, Apache version 2.2.16. This seems to be a memory leak, but I have no idea how to start debugging it -- any hints?
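
    As a first debugging step (just a sketch; the process name apache2 is assumed for Ubuntu), it can help to log per-worker memory over time and see whether individual idle children keep growing, which distinguishes a per-process leak from too many workers simply being kept around:

        # Record the resident memory of every Apache child once a minute; steadily
        # growing RSS on otherwise idle workers points at a leak inside the workers,
        # a flat line points at configuration (MaxClients/MaxRequestsPerChild) instead.
        while true; do
            date
            ps -o pid,rss,etime,cmd -C apache2 --sort=-rss | head -n 15
            sleep 60
        done >> /var/log/apache-rss-watch.log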

    Read the article

  • How to forward traffic using iptables rules?

    - by ProbablePattern
    I am new to iptables and I have been doing Google searches for a few days now without finding a good solution to this problem. I have computer A with a public IP address (say 192.0.2.1) that can access the Internet unrestricted. I have another computer B with a private IP address (192.168.1.1) that can only access computer A. How do I use iptables to forward network traffic from B through A to the Internet? I need to use HTTP, FTP, and HTTPS in order to use apt-get with sudo. Both computers run Ubuntu Linux. I have tried using Squid, but I think it is far too complicated for what I need to do.
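
    For reference, the standard NAT setup on computer A looks roughly like this. It is a sketch: eth0/eth1 stand for A's Internet-facing and B-facing interfaces, and 192.168.1.254 is a placeholder for A's address on the network it shares with B -- substitute the real names and addresses:

        # On computer A: enable forwarding and NAT traffic from B out to the Internet
        sudo sysctl -w net.ipv4.ip_forward=1
        sudo iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE
        sudo iptables -A FORWARD -i eth1 -o eth0 -j ACCEPT
        sudo iptables -A FORWARD -i eth0 -o eth1 -m state --state ESTABLISHED,RELATED -j ACCEPT

        # On computer B: send everything via A (B also needs a reachable DNS resolver)
        sudo ip route replace default via 192.168.1.254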

    Read the article

  • How to attach multiple IPv6 IPs to eth1 on Debian

    - by Noodles
    I've just got a new server with native IPv6. I want to attach multiple IPv6 addresses to eth1, but the only way I can see to do so is to attach them individually, i.e.:
    address 2607:f0d0:xxxx:xxxx::2
    address 2607:f0d0:xxxx:xxxx::3
    address 2607:f0d0:xxxx:xxxx::4
    Is it possible to bind whole subnets of IPv6 addresses to a single network interface on Debian? My server host tells me I have 18,446,744,073,709,551,616 IPv6 addresses for that server; surely it becomes a nightmare to manage if they all have to be bound individually (plus ifconfig would look messy). Does anyone have a solution?
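
    For a handful of extra addresses, they can at least be added in a loop rather than one stanza each (a sketch; the 2607:f0d0:xxxx:xxxx prefix is the redacted one from the question). The same "ip -6 addr add" lines can be made permanent as "up" commands under the eth1 stanza in /etc/network/interfaces:

        # Attach additional IPv6 addresses to eth1 at runtime
        for host in 2 3 4; do
            sudo ip -6 addr add "2607:f0d0:xxxx:xxxx::${host}/64" dev eth1
        done
        ip -6 addr show dev eth1   # verify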

    Read the article

  • Why does the Windows XP Task Manager icon disappear from the tray?

    - by Jason Owen
    One of the more annoying bugs in Windows XP is the tendency of the Task Manager icon to not show up in the tray (aka the notification area). Sometimes it does, sometimes it doesn't, and it's not consistent enough to have an obvious cause. Looking on Google turns up a bunch of forums describing the same problem but no working solution. Why does the Task Manager icon sometimes not show up? How can I repair it when this happens (how can I make it show up when it's missing)? And how can I prevent it from breaking in the first place, so that I don't have to worry about repairing it after the fact?

    Read the article

  • How can I continue audio playback even after switching user?

    - by klyonrad
    I just tested it with iTunes: after switching the user account (only after logging into the other account, to be precise), the audio playback from account "A" stops. However, iTunes continues playing in the background, which I realized after switching back to account "A". Very frustrating, because it is kind of a deal-breaker for me; the other person should be able to have some personalized settings, while it is still my computer and the main account has all the music, obviously. The ideal solution would be for audio output to keep running while the user still has the ability to pause it manually...
    EDIT: I tested a bit more: "desktop" apps like VLC don't output sound but continue running; the stock Music.app in Metro pauses the music and continues playing when switching back.

    Read the article

  • Moving the home directory to a new drive

    - by Mellowcandle
    I have no more space left on my hard drive, so I bought a new one, and I would like this new drive to hold the home folder. I thought of copying all the stuff I have in the home folder to the new drive's partition and creating a symbolic link from ~ to there. The problem I have is that I can't really delete the home folder while I'm logged in as the current user. Is there a way to log out and log in as root in Linux Mint? I want to be able to do this without a live-CD solution.
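
    For reference, the usual approach avoids the symlink entirely: copy the data across, then mount the new partition directly on /home. A sketch only -- /dev/sdb1 and ext4 are placeholders for the new partition and its filesystem, and the copy should be done from a console or another account while the user is logged out:

        sudo mkdir /mnt/newhome
        sudo mount /dev/sdb1 /mnt/newhome
        # Copy everything, preserving permissions, xattrs and hard links
        sudo rsync -aXH /home/ /mnt/newhome/
        # Keep the old data until the new setup is verified
        sudo mv /home /home.old && sudo mkdir /home
        # Mount the new partition on /home now and at every boot
        echo '/dev/sdb1  /home  ext4  defaults  0  2' | sudo tee -a /etc/fstab
        sudo mount /home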

    Read the article

  • Can /usr/src be a symlink?

    - by lord.didger
    I want to store the source code of all programs I have installed in /usr/src. However, due to the size of the drive, I made /usr/src a symlink that points to ~/src. That was nice. Unfortunately, it caused virtualbox-dkms to fail to build the VirtualBox kernel module, because of a symlink within linux-headers-*-common: 'scripts' points to ./../lib/linux-kbuild-3.1/scripts, which is fine in the /usr/src directory but wrong in ~/src. Can I bypass this problem, or is the only solution to store the sources within the /usr/src directory itself?
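
    One workaround that usually keeps relative symlinks like that one working is to make /usr/src a real directory again and bind-mount ~/src on top of it, so paths inside /usr/src resolve exactly as if the files lived there. A sketch (the home path is a placeholder):

        # Replace the symlink with a real mount point, then bind-mount the real storage
        sudo rm /usr/src && sudo mkdir /usr/src
        sudo mount --bind "$HOME/src" /usr/src
        # Make it permanent (adjust /home/youruser to the real path):
        echo '/home/youruser/src  /usr/src  none  bind  0  0' | sudo tee -a /etc/fstab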

    Read the article

  • Different behaviour of script locally and over ssh

    - by neorg
    I have a script on server-A:
    Script-A:
        #!/bin/bash -l
        echo "script-A.sh" | change-environment.sh
    When I ssh onto server-A and execute it, it works fine. However, when I run
        ssh user@server-A ./script-A.sh
    Script-A executes, but throws an undefined variable error in change-environment.sh. change-environment.sh runs in the C shell (I have no control over that script, so the method I have used is about the only way I can use it), but everything else is in bash. I had found a similar question, "I can run a script locally, but cannot do 'ssh HOSTNAME /path/to/script.sh'", but there was no solution to the issue and it was a year old.
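
    A quick way to narrow this down (a sketch; the file names are arbitrary) is to compare the environment the script sees in the two cases -- a command run via ssh gets a much sparser environment than an interactive login session, and the missing variable should show up in the diff:

        # Capture the environment a bare ssh command gets
        ssh user@server-A 'env | sort' > remote_env.txt
        # Then log in interactively, run "env | sort > interactive_env.txt" on the
        # server, copy it back, and compare:
        diff remote_env.txt interactive_env.txt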

    Read the article

  • Central storage for Windows user account home dirs -- hardware/software needed?

    - by mtkoan
    We have 120+ users in our network, and are endeavoring to centralize logon authentication and home directory storage server-side. Most of the users are on Windows 2000/XP machines, and a few run Mac OS X. Ideally the solution will be open source -- can this all be managed from a Linux server running LDAP and Samba? Or would a hacked NAS box with FreeNAS or similar suffice? Or is Micro$oft's Active Directory really the preference here? Is it viable to store PST files on this server for users to read from and write to? They are very large, ~1.5 GB. We have no mail server (or money) capable of Exchange or IMAP, only an old POP3 server. What kind of hardware horsepower and network architecture should we have for this kind of thing?
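
    On the Samba side, serving per-user home directories is the stock [homes] share; a minimal sketch with the share definition shown as comments (the authentication backend -- LDAP, AD, or a local tdbsam -- is a separate decision and deliberately not shown):

        sudo apt-get install samba
        # Add a [homes] share to /etc/samba/smb.conf so each authenticated user
        # sees their own home directory as \\server\username:
        #
        #   [homes]
        #      comment = Home Directories
        #      browseable = no
        #      read only = no
        #      valid users = %S
        #
        sudo service smbd restart   # service name varies by release ("samba" on older ones)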

    Read the article

  • CI - How long is continuous?

    - by Andy
    We are currently using CCNet as our continuous integration server. Most projects check for changes every 30 seconds (the default) and, if needed, perform a build (unit tests, StyleCop, FxCop, etc.). We've got quite a few projects now, and the server spends most of its time near 100% CPU utilization. This has alarmed some of the development team, even though the server is responsive and builds still take about the same length of time they always have. It's been suggested that we lower the check interval to about five minutes. To me that seems too long, and we risk people committing code and then going home for the weekend, leaving a broken build that could hold up others. In response, the suggestion is that if someone needs to know the results they can force a build. But that seems to defeat the purpose of CI, as I thought it was supposed to be automated. My proposed solution is just to get another build server and split the builds amongst the servers. Am I thinking about this the wrong way, or is there a point where, if integration isn't often enough, you're not really doing CI anymore?

    Read the article

  • Options for physical architecture of a Rails site regarding a caching server or CDN

    - by timpone
    I have a Rails app that currently sits on a single server. In production, I force_ssl for everything. I am interested in using a caching server for images (I'm fine with CSS and JS being served from the origin for the time being). Would nginx or Varnish (which I have no experience with) be a better solution (as of October 2012)? I'd imagine it would be easy to switch these around while still on this single-server architecture. Or would something like CloudFront (which I also have no experience with) make sense for hosting the image files? I know this is a vague question, but I appreciate any current feedback. Thanks in advance.

    Read the article

  • How to get a Windows domain server to recognize a Linux machine by its name?

    - by CaCl
    At my company I ran into an issue where we have a Linux machine that serves up a Subversion repository. It's hooked up to Active Directory via LDAP. We got an account set up for an application, and the limited-workstations restriction was applied to it so it doesn't have full access to the network. The problem is that even though the hostname of our machine resolves correctly for me, the credentials for the application account seem to come back as not allowed based on the machine name (the error was related to authorized workstations). I don't have access to any of the domain servers, but it might be helpful to come to management or the high-level techs with some ideas; they don't seem to have a solution besides allowing all workstations for the user. Does anyone have any idea how to get my Linux machine to properly identify itself to the domain by name?
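
    One hedged idea to bring to the domain admins: if the box is only doing plain LDAP binds, the domain has no computer account for it, so the workstation restriction has no name to match against. Joining the machine to the domain with Samba/winbind gives it a proper computer account under its hostname; a sketch only (requires Kerberos/Samba configured for the realm and domain-admin credentials, and may or may not satisfy the restriction in your environment -- the realm and admin account below are placeholders):

        sudo apt-get install samba winbind krb5-user
        hostname                          # the short name that becomes the computer account
        kinit Administrator@EXAMPLE.COM   # placeholder realm and admin account
        sudo net ads join -U Administrator
        sudo net ads testjoin             # should report that the join is OK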

    Read the article

  • My proposed design is usually worse than my colleague's - how do I get better?

    - by user151193
    I have been programming for a couple of years and am generally good when it comes to fixing problems and creating small-to-medium scripts; however, I'm generally not good at designing large-scale programs in an object-oriented way. A few questions: Recently, a colleague who has the same number of years of experience as me and I were working on a problem. I had been working on the problem longer than he had; however, he came up with a better solution, and in the end we're going to use his design. This really affected me. I admit his design is better, but I wanted to come up with a design as good as his. I'm even contemplating quitting the job. I'm not sure why, but suddenly I feel under some pressure (e.g., what would the juniors think of me?). Is this normal? Or am I reading a little too much into this? My job involves programming in Python. I try to read source code, but how do you think I can improve my design skills? Are there any good books or software that I should study? Please enlighten me. I will really appreciate your help.

    Read the article

  • Is there a multi-user Remote Desktop app for Mac OS X?

    - by Peter Walke
    Is there a remote desktop app for the Mac that allows multiple people to be remoted in at the same time, similar to RDP on Windows? I've used VNC, but that only allows one person to control the computer. For some background: I'd like to set up a Mac that many users can RDP into from PCs to do Xcode development. I did some searching and didn't find anything, so I'm assuming it's just not possible, but I want to confirm. Thanks.
    Update: Thanks to a link in one of the answers, I found a reasonable solution: AquaConnect.

    Read the article
