Search Results

Search found 35634 results on 1426 pages for 'internet directory'.


  • No internet - please help?

    - by All
    I just got Ubuntu and I am really, really a Linux novice. I can't get the wireless to work. I played around with it until I saw a message saying "wireless firmware missing" - Hardware address 00:14:A5:6A:17:C2. On my laptop there is a button with a little antenna icon that turns the wireless on. The funny thing is that this button does not seem to do anything right now - its light is not lit and does not light up. Any thoughts you have on getting this going are appreciated!
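
    For reference, a few commands usually pin down which firmware is missing; this is only a sketch, and the right package to install depends on what the chipset turns out to be:

        rfkill list               # is the radio soft- or hard-blocked?
        sudo lshw -C network      # identify the wireless chipset and its driver
        sudo apt-get install linux-firmware   # generic firmware bundle; Broadcom
                                              # chips often need firmware-b43-installer instead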

    Read the article

  • Using Windows as a gateway to the internet

    - by James Wright
    My customer currently blocks outbound RDP and SSH, which means that none of their employees can get access to external Windows and Linux boxes (at the console level). However, a need has recently arisen to give access to an assortment of RDP and SSH endpoints scattered throughout the internet. The endpoint IP addresses are a moving target, and an access list exists to define what those IP addresses are. So now my customer wants to have a single Windows server that they control as the sole outbound point for RDP/SSH to the internet. Consider it a jump box to the internet. If one of our admins has access to this Windows box, they can log on, and from there bounce around to RDP/SSH endpoints on the internet. Is a standard Windows 2008 box going to work as a jump box? For example, I seem to recall that Win2k8 limits the number of users that can log on simultaneously, which means that the jump box may not be accessible if lots of users are on it. Any advice as to how to make this work?
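
    For reference, a quick way to see how many interactive sessions such a box is carrying (the server name below is a placeholder):

        query session /server:JUMPBOX01

    Out of the box, Windows Server 2008 allows only two concurrent administrative Remote Desktop sessions; supporting more simultaneous users means installing the Remote Desktop Session Host role and licensing RDS CALs.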

    Read the article

  • Ubuntu 12.04 - PPTP VPN is the only Internet Access

    - by user212553
    I know this has been covered. I've read dozens of posts but still have questions. I have a work server whose traffic should never leave my house without encryption. The VPN is PPTP. Currently I have a cron job that checks the status of the ppp0 adapter each minute. If the connection drops, which it does fairly often, it shuts key components down. It's fairly easy to restart PPTP with "nmcli con up id 'myVPNServer'", but there's no assurance it will reconnect, and I need a better way to stop traffic (other than killing apps) when ppp0 is down.

    The two options I've seen discussed are the firewall (UFW, Firestarter, iptables) or the route tables. I could be easily swayed to consider the firewall option, but I focused on the route tables since no new service needs to be started. My questions involve the way the route tables change, and then specifics on rules.

    When I start the PPTP VPN, the route tables change. That suggests that if the VPN drops, the table will change back, defeating my stated intent of preventing external traffic. How can I make "sticky" changes to the route table that will persist even if the VPN connection drops? Perhaps the check boxes "Ignore automatically obtained routes" or "Use this connection only for resources on its network" (which are part of the VPN configuration options)? It would seem that if I can force the active VPN route table to stay in effect, even when the VPN drops, this will effectively kill any external traffic should the VPN drop. That would give me the latitude to run a routine to restart the VPN from the command line (assuming the route table rules don't prevent me re-establishing the connection).

    My route table, with the VPN active, is below. Any comments on what 10.10.1.1 is?

        $ ip route list
        default dev ppp0 proto static
        10.10.1.1 dev ppp0 proto kernel scope link src 10.10.1.11
        VPN_Server_IP_Address via 192.168.1.1 dev eth0 proto static
        VPN_Server_IP_Address via 192.168.1.1 dev eth0 src 192.168.1.60
        169.254.0.0/16 dev eth0 scope link metric 1000
        192.168.1.0/24 dev eth0 proto kernel scope link src 192.168.1.60 metric 1
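
    For comparison, a firewall-based kill switch is usually simpler than pinning routes, because the rules stay in force whether or not ppp0 exists. The following is a minimal iptables sketch, assuming eth0 is the LAN interface, 192.168.1.0/24 is the local subnet, and using the same VPN_Server_IP_Address placeholder as the route table above; it would need to be rerun (or persisted) after a reboot:

        #!/bin/sh
        # Allow loopback, the local subnet, and the PPTP control/tunnel traffic
        # needed to (re)build the VPN; everything else must leave via ppp0 or be dropped.
        iptables -F OUTPUT
        iptables -A OUTPUT -o lo -j ACCEPT
        iptables -A OUTPUT -o eth0 -d 192.168.1.0/24 -j ACCEPT                              # LAN
        iptables -A OUTPUT -o eth0 -d VPN_Server_IP_Address -p tcp --dport 1723 -j ACCEPT   # PPTP control
        iptables -A OUTPUT -o eth0 -d VPN_Server_IP_Address -p gre -j ACCEPT                # GRE tunnel
        iptables -A OUTPUT -o ppp0 -j ACCEPT                                                # tunneled traffic
        iptables -A OUTPUT -j DROP                                                          # nothing else leaves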

    Read the article

  • SEO is Still the Key to Your Internet Marketing Success

    After many years, Search Engine Optimization (SEO) is still the key to creating long-term brand awareness and online visibility and to attracting increased traffic to your website, enabling you to rank higher in the organic search results for a set of targeted, high-value keyword phrases. SEO is a set of activities that gives your website greater relevance and authority, which the search engines use to determine how high your website should rank when a specific keyword phrase is searched for.

    Read the article

  • .NET Rocks! Internet Audio

    - by Editor
    .NET Rocks! is a weekly talk show for anyone interested in programming on the Microsoft .NET platform. The shows range from introductory information to hardcore geekiness. Many of their listeners download the MP3 files and burn CDs for the commute to and from work, or simply listen on a portable media player. Download .NET Rocks! audio.

    Read the article

  • Just wondering about "Do-It-Yourself Apps" on the internet versus apps written by us developers

    - by user657514
    Hi, I have been doing Objective-C programming over the past few weeks, and I have learned a lot. However, I see that there are web companies offering services directly from their websites that let consumers create apps through point-and-click, drag-and-drop features, without any code. Clearly they are more cost-effective and faster than having a developer write an app. I was wondering if there are any advantages to having a developer build an app for someone, other than the obvious advantage that it's got a custom look and feel. Could someone please clarify? I'm new, and would like to evaluate whether it is worthwhile spending time learning a whole new development environment when someone could just use a web service to make an app for multiple platforms. Thanks

    Read the article

  • Directory Synchronization

    - by Robert May
    We’re using federated security with Office 365, and everything was running swimmingly until I started getting the following error when trying to synchronize security information: "An unknown error occurred with the Microsoft Online Services Sign-in Assistant. Contact Technical Support." Great. Very descriptive.

    In the event viewer, you get a bit more detail: GetAuthState() failed with -2147186688 state. HResult:0. Contact Technical Support. (0x80048831)

    If you do some searching, you'll find that there are a couple of MSDN articles about this error. In KB2502710 you're told to reinstall the Sign-in Assistant; this one requires a reboot. In KB2517393 you're told to make sure that your proxy settings are working correctly. I'm not using a proxy, and everything was set up right.

    Rather frustrating, and I couldn't figure out what was going on. What finally keyed me in was the error number being presented. Rather than 80048800, which is listed in the second article, I was getting 80048831. I did a quick search and found something that was seemingly unrelated here. Could it really be so simple as the password having expired for my synchronization user?

    Turns out, it was that simple. Once the password was reset and reentered, everything worked great again. Since this isn't a user that humans use, I also don't want the password to expire. You can find the instructions for that (use Set-MsolUser -UserPrincipalName <user ID> -PasswordNeverExpires $true) here.

    Technorati Tags: Office 365
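
    For reference, a sketch of that command in context; it assumes the MSOnline PowerShell module is installed, and the UPN below is a hypothetical stand-in for the actual synchronization account:

        # Connect and stop the sync account's password from expiring
        Import-Module MSOnline
        Connect-MsolService    # prompts for Office 365 administrator credentials
        Set-MsolUser -UserPrincipalName "dirsync@contoso.onmicrosoft.com" -PasswordNeverExpires $true
        # Confirm the change took effect:
        Get-MsolUser -UserPrincipalName "dirsync@contoso.onmicrosoft.com" |
            Select-Object DisplayName, PasswordNeverExpires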

    Read the article

  • Installing a directory with a Debian Package

    - by Meisie
    Hi guys, I want to create a Debian package that installs a bunch of folders to a system, but I can't get it working. The package gets created without any errors, and lintian also says it's okay, but installing does nothing. The rules file looks like this:

        #!/usr/bin/make -f

        logs = $(CURDIR)/shell_logs/
        DEST1 = /opt/Pacetutor/

        build: build-stamp
        build-stamp:
        	dh_testdir
        	touch build-stamp

        clean:
        	dh_testdir
        	dh_testroot
        	rm -f build-stamp
        	dh_clean

        install: build clean $(logs)
        	dh_testdir
        	dh_testroot
        	dh_prep
        	dh_installdirs
        	mkdir -m 755 -p $(DEST1)   # this is probably optional or not needed
        	cp -r $(logs) $(DEST1)     # using mv works, but that's not what I want

        binary-indep: build install
        	dh_testdir
        	dh_testroot
        	dh_installchangelogs
        	dh_installdocs
        	dh_installexamples
        	dh_installman
        	dh_link
        	dh_compress
        	dh_fixperms
        	dh_installdeb
        	dh_gencontrol
        	dh_md5sums
        	dh_builddeb

        binary-arch: build install

        binary: binary-indep binary-arch

        .PHONY: build clean binary-indep binary-arch binary install
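
    For what it's worth, a common reason a package that builds cleanly installs nothing is that the install target copies files onto the build machine itself instead of into the staging tree under debian/<package>, which is what dh_builddeb actually packs. A sketch of that change, assuming the binary package is named "pacetutor" (and dropping clean from the prerequisites, since it deletes the build stamp first):

        install: build
        	dh_testdir
        	dh_testroot
        	dh_prep
        	dh_installdirs
        	# stage the files where dh_builddeb will pick them up
        	mkdir -p $(CURDIR)/debian/pacetutor$(DEST1)
        	cp -r $(logs) $(CURDIR)/debian/pacetutor$(DEST1)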

    Read the article

  • How would I isolate one networked PC to LAN only?

    - by itsraine
    I would like to have one of my PCs available to the rest of my home network for file sharing and VNC access, but I want to block any Internet traffic going to and from that PC. In other words, I want all the local PCs connected to the router to function as any normal LAN would, but when it comes to the Internet, I want one particular PC to be "safe" from it. My guess is that this involves some sort of port blocking or another router function, but I'm not quite sure.
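
    If the router can't do this on its own, a host-side firewall is one option. The following is a minimal iptables sketch, assuming (both are assumptions) that the PC in question runs Linux and that the LAN is 192.168.1.0/24; the rules would need to be persisted to survive a reboot:

        #!/bin/sh
        # Permit traffic only to and from the local subnet; drop everything else.
        iptables -P INPUT DROP
        iptables -P OUTPUT DROP
        iptables -A INPUT  -i lo -j ACCEPT
        iptables -A OUTPUT -o lo -j ACCEPT
        iptables -A INPUT  -s 192.168.1.0/24 -j ACCEPT   # file sharing, VNC, etc.
        iptables -A OUTPUT -d 192.168.1.0/24 -j ACCEPT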

    Read the article

  • "Index of ..." directory's files listing

    - by Tony
    On my course we've got homework on a site, in folders such as:

        http://example.com/files/tasks1-edc34rtgfds
        http://example.com/files/tasks2-0bg454fgerg
        http://example.com/files/tasks3-h1dlkjiojo8
        ...

    Each tasksi-xxxxxxxxxxx is a folder with 11 random characters at the end. When you view the above URLs in a browser, you see "Index of /tasksi-xxxxxxxxx" with all the files in that folder. When you view http://example.com/files/ you see only an empty HTML page with the words "Hello, world". The problem is that you can't look into the next task without knowing its URL. So, for example, we've got the URLs for tasks1 and tasks2, but we can't guess what the tasks3 URL will be (as we would need to know the 11 random characters at the end). How can I get the list of all the directories? (Is there a way to request something like http://example.com/files/task1-aflafjal343/.., or another way?) I want to see all the upcoming homework tasks.

    Read the article

  • Installed Ubuntu on a MacBook Pro and am running into errors with the internet

    - by user209270
    I recently installed Ubuntu (64-bit, 12.04) on a mid-2012 MacBook Pro, and I've been having some really annoying issues. Chief among them is that my wireless doesn't seem to function properly. I've installed Ubuntu on this Mac before, and again this time the drivers for the wireless card were missing. I installed them using something along the lines of:

        sudo su
        apt-get update
        apt-get upgrade
        apt-get install b43-fwcutter firmware-b43-installer

    A new problem I've been experiencing, however, is that none of the networks available in my home show up. I've been able to connect to them by going into "Connect to hidden networks" and typing in the network name and password, but when I restart the computer, my network (let's just call it "Network1") doesn't show up. I tried going into "Edit Connections", and it shows up in the wireless tab, but it is never connected to automatically and never shows up in the drop-down menu. Any help?
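
    One hedged thing to check (NetworkManager behavior varies by version) is the saved keyfile for the connection; "Network1" below stands in for the real connection name:

        sudo cat /etc/NetworkManager/system-connections/Network1
        # the [connection] section should not contain autoconnect=false, and for
        # an SSID that is not being broadcast the wireless section needs hidden=true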

    Read the article

  • Achieve Your Internet Marketing Goals With SEO

    Internet marketing SEO has often proven itself to be the key that turns little sites into profit-generating machines, and that's the truth. I have watched it happen with my own eyes. I accept that internet marketing SEO isn't precisely straightforward, but it's not that difficult a task for somebody who knows and comprehends how search engines and their ranking principles work.

    Read the article

  • Problem connecting Reliance Broadband+

    - by Athira R
    I tried connecting Reliance broadband in Ubuntu. An autorun prompt comes up, but when I click Install, another pop-up appears saying the autorun file was not found. And when I checked in the Internet connection settings to add a connection, the Reliance broadband option itself is not there as a connected device.

    This is Athira. I could not find any add-comment option on my main question, so I created a new account and am adding this as a new answer to your question. I got this after running lsusb in Ubuntu 10.04:

        athira@athira-laptop:~$ sudo lsusb
        Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 005 Device 004: ID 12d1:1505 Huawei Technologies Co., Ltd.
        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 002: ID 08ff:1600 AuthenTec, Inc. AES1600
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 003 Device 002: ID 0421:0508 Nokia Mobile Phones E65 (PC Suite mode)
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 002 Device 002: ID 064e:a103 Suyin Corp.
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        athira@athira-laptop:~$
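
    The 12d1:1505 line is the Huawei data card. On older releases these sticks often enumerate as a storage device and have to be mode-switched before any modem appears; the following sketch makes that assumption about this particular device:

        sudo apt-get install usb-modeswitch usb-modeswitch-data
        sudo usb_modeswitch -v 0x12d1 -p 0x1505 -J   # -J sends the Huawei switch message
        dmesg | tail         # watch for new ttyUSB serial devices appearing
        ls /dev/ttyUSB*      # ports a dialer such as wvdial can then use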

    Read the article

  • Web Directory Submission - An Important Part of Link Building

    Submitting to web directories is a vital part of every link-building strategy. Apart from driving traffic to your website via direct recommendations, web directories offer static, one-way links to your website, boosting your link popularity and improving your rankings on the major search engines. Search engine optimization has learned to turn directory and article submission to its advantage.

    Read the article

  • Restricting A Directory Through .htaccess

    - by Whitechapel
    I'm trying to put all of my FTP accounts into a folder at /public_html/ftp and password-protect it so search bots can't crawl their private files. I'm also trying to redirect all site traffic from the non-www to www. I keep getting 500 errors when accessing the site, and I also need www.vivalanation.com/ftp to redirect to www.vivalanation.com/ftp/, because /ftp without the trailing slash just errors out.

    Here is my .htaccess in the /public_html/ftp folder:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

        AuthName "FTP Access"
        AuthType Basic
        AuthUserFile /home1/vivalst/.htpasswds/public_html/ftp/passwd
        Require valid-user

    I created a passwd file in /.htpasswds/public_html/ftp. And here is my basic .htaccess in the root of /public_html/:

        RewriteEngine on
        RewriteBase /
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
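
    One hedged suggestion for the trailing-slash part: mod_dir normally appends the slash by itself (DirectorySlash On is the default), so if /ftp errors out, an explicit rule in the root /public_html/.htaccess can force it. A minimal sketch:

        RewriteEngine on
        RewriteRule ^ftp$ /ftp/ [R=301,L]

    A 500 error on every request to the protected folder is also worth checking against the AuthUserFile path, since a passwd file that Apache cannot find or read fails in exactly that way.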

    Read the article

  • Halloween: Season for Java Embedded Internet of Spooky Things (IoST) (Part 2)

    - by hinkmond
    To start out our ghost hunting here at the Oracle Santa Clara campus office, we first need a ghost sensor. It's pretty easy to build one, since all we need to do is to create a circuit that can detect small fluctuations in the electromagnetic field, just like the fluctuations that ghosts cause when they pass by... Naturally, right? So, we build a static charge sensor and will use a Java Embedded app to monitor for changes in the sensor value, running analytics using Java technology on a Raspberry Pi. Bob's your uncle, and there you have it: a ghost sensor. See: Ghost Detector

    So, go out to Radio Shack and buy up these items:

    Shopping list:
    1 - NTE312 JFET N-channel transistor (this is in place of the MPF-102)
    1 - Set of jumper wires
    1 - LED
    1 - 300 ohm resistor
    1 - Set of header pins

    Then, grab a flashlight, your Raspberry Pi, and come back here for more instructions... Don't be afraid... Yet.

    Hinkmond
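
    As a preview of the monitoring side, here is a minimal Java sketch of the polling loop; the GPIO pin number and the sysfs path are assumptions (the sensor could be wired to any free pin, exported beforehand with echo 4 > /sys/class/gpio/export):

        // GhostSensor.java - poll a sysfs GPIO value and report changes
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class GhostSensor {
            public static void main(String[] args) throws Exception {
                Path value = Paths.get("/sys/class/gpio/gpio4/value");
                int last = -1;
                while (true) {
                    // sysfs reports the pin level as "0" or "1" plus a newline
                    int level = Integer.parseInt(new String(Files.readAllBytes(value)).trim());
                    if (level != last) {
                        System.out.println("Field fluctuation: level=" + level);
                        last = level;
                    }
                    Thread.sleep(100); // poll ten times a second
                }
            }
        }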

    Read the article

  • Website and file/directory permissions

    - by mathiass
    I've been given a task to fix a website. One of its issues is that on one page the images have broken links: the images are not showing, and clicking on an image (i.e. the direct link to the image file) results in a 403 (Forbidden) error. I am looking for some feedback on the possible cause.

    The directory where the images are stored has the following permissions (I had to hide the names):

        drwxrws--- www "group" 10240 Aug 2008 "image directory name"

    I checked the page source code, and everything seems to be in place. The rest of the site, and other images outside that image directory, are showing fine. I was told that recently there have been some changes to the server. I'm assuming that there is no fault in the source code and that the permissions are - or used to be - correct (since the site was working before, and no recent changes have been made to the site itself). My only thoughts at the moment are that either:

    a) the directory permissions should be drwxrws--x (executable for other users), or
    b) there is a change in the server settings that I don't know of.

    Is there anything else I should check?
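
    For reference, a diagnostic sketch along the lines of option a); the path is a placeholder, and the assumption is that the web server process runs as a user that is neither the owner "www" nor a member of the directory's group:

        ls -ld /path/to/image-directory            # confirm the current permissions
        sudo chmod o+x /path/to/image-directory    # let other users traverse the directory
        sudo chmod o+r /path/to/image-directory/*  # and read the files inside

    Note that o+x on the directory is enough for direct links to files to work (given readable files); listing the directory contents would additionally need o+r on the directory itself.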

    Read the article
