Search Results

Search found 5969 results on 239 pages for 'seo man'.

Page 180/239 | < Previous Page | 176 177 178 179 180 181 182 183 184 185 186 187  | Next Page >

  • Cannot mount a CIFS network share on Ubuntu over VPN

    - by Aron Rotteveel
    I have set up a VPN connection to our Windows 2008 server at the office and it seems to work fine. For some reason, however, I am still not able to access the network shares over the VPN connection using my standard fstab entries. When I am physically connected to the network, it works fine, but when trying this over VPN I get the following error: mount error(110): Connection timed out Refer to the mount.cifs(8) manual page (e.g. man mount.cifs) My /etc/fstab looks like this: //server2008/share /mnt/share cifs iocharset=utf8,credentials=/home/aron/.smbcredentials,uid=1000 0 0 As said, it works fine when physically connected, but over VPN it just won't work. Any help is appreciated.
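
    One common culprit (an assumption, not confirmed by the poster): NetBIOS name resolution for //server2008 often fails across a VPN even when the share itself is reachable. A quick test is to mount by IP address instead of by name, for example:

        sudo mount -t cifs //192.168.0.10/share /mnt/share \
            -o iocharset=utf8,credentials=/home/aron/.smbcredentials,uid=1000

    Here 192.168.0.10 is a placeholder for the server's VPN-side address; if this works, adding a server2008 entry to /etc/hosts lets the original fstab line keep working over the VPN.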

    Read the article

  • Move site from one TLD to another

    - by Amol Ghotankar
    If we want to move a site from, say, xyz.com to xyz.org, what do we need to do to make sure SEO is not hurt? I am doing something like this: point both xyz.com and xyz.org to the same IP where my site is running; use a canonical URL of xyz.org/* instead of xyz.com/*; add the site to Webmaster Tools and make a change-of-address request. The problem is that we are not able to 301 redirect from xyz.com to xyz.org, as both are on the same IP and doing so causes a redirect loop and an error. How do we fix this? Please help.
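
    Since both names hit the same server, the loop can usually be avoided by making the redirect conditional on the requested Host header, so it only fires for xyz.com. A minimal sketch, assuming Apache with mod_rewrite enabled (.htaccess):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?xyz\.com$ [NC]
        RewriteRule ^(.*)$ http://xyz.org/$1 [R=301,L]

    Requests for xyz.org do not match the condition, so they are served normally and the loop disappears.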

    Read the article

  • 12.10 cifs shares not mounting after modifying /etc/fstab

    - by Doug
    From Ubuntu 11.04 through 12.04, I've been able to mount my NAS shares by first making directories in the /media folder, then running sudo gedit /etc/fstab to include the following line for each share I want to auto-mount: //servername/sharename /media/windowsshare cifs guest,uid=1000,iocharset=utf8,codepage=unicode,unicode 0 0 However, I upgraded to 12.10, and suddenly I'm not able to mount the shares after saving /etc/fstab and running sudo mount -a, which gives me this error: Refer to the mount.cifs(8) manual page (e.g. man mount.cifs) mount error(22): Invalid argument In Nautilus, the shares are visible under the Network tab, unmounted, and when I click on a share I get the following message: mount: only root can mount //192.168.1.71/photos on /media/photos I checked to ensure smbfs was installed, and no problems there. I'm stumped.
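
    A plausible explanation (an assumption based on the error text, not confirmed here): newer versions of mount.cifs reject options that older releases silently ignored, and codepage= and the bare unicode flag are not valid mount.cifs options, which would produce exactly mount error(22): Invalid argument. A trimmed fstab line to try:

        //servername/sharename /media/windowsshare cifs guest,uid=1000,iocharset=utf8 0 0

    The separate "only root can mount" message from Nautilus concerns who may mount; adding the users option to that option list would allow non-root users to mount the share.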

    Read the article

  • Understanding the Java Ecosystem

    - by syrion
    I have traditionally had the "luxury" of being a one-man development team. I've used Python extensively, have a reasonable command of Perl, PHP, and JavaScript. My problem is Java. I can write Java code. I'm not great at it--unlike Python, I rarely make use of anything unique to Java when I'm writing it. Furthermore, my experience is mostly in simple GUI/console programming. Unfortunately, I'm currently pursuing an IT degree where Java is the lingua franca. My database class is requiring that our projects be written in Java using servlets, and I just can't wrap my head around the ecosystem. Is there a good online overview of or tutorial on how the Java web ecosystem works? I have Thinking in Java, but it's mostly just the language itself (which I understand well enough to get by). I have looked at the Sun servlet tutorial, but it seems outdated.

    Read the article

  • Problem in Kubuntu: Default browser never opens

    - by user170852
    I'm using Kubuntu, and I switched the default browser to my favourite one. The thing is, when I did this, both my Twitter clients stopped working. Now every Twitter client I download and install fails to open a web browser to finish my account authorization. It's impossible that they're all broken, so I reset my default browser to rekonq, and it works well as a default browser, but I still can't log in to Twitter from any client. In fact, every button in every program that should open a browser window (like Amarok with the "like on last.fm" button) does nothing. So I think there may be a problem with my system, but I can't figure out what it is. My user is not in the admin group (it's a shared user); I have an admin user but I only use it to install programs. Could that be the cause? Also, hitting Alt+F2 and entering man:file opens a rekonq window normally.
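
    One thing worth checking (an assumption, since the post doesn't say which setting the clients consult): many non-KDE programs ask the xdg-settings/xdg-open layer for the default browser rather than KDE's own component chooser, and the two can disagree. From a terminal:

        xdg-settings get default-web-browser
        xdg-settings set default-web-browser rekonq.desktop
        xdg-open http://example.com

    If xdg-open fails here, the Twitter clients would fail the same way; the rekonq.desktop name is illustrative and should match a file under /usr/share/applications.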

    Read the article

  • How many developers before continuous integration becomes effective for us?

    - by Carnotaurus
    There is an overhead associated with continuous integration, e.g., set-up, re-training, awareness activities, stoppage to fix "bugs" that turn out to be data issues, enforced separation-of-concerns programming styles, etc. At what point does continuous integration pay for itself?

    EDIT: These were my findings. The set-up was CruiseControl.NET with NAnt, reading from VSS or TFS. Here are a few reasons for failure, which have nothing to do with the setup:

    Cost of investigation: the time spent investigating whether a red light is due to a genuine logical inconsistency in the code, a data-quality issue, or another source such as an infrastructure problem (a network issue, a timeout reading from source control, a third-party server being down, etc.).

    Political costs over infrastructure: I considered performing an "infrastructure" check for each method in the test run. I had no solution to the timeouts except to replace the build server; red tape got in the way and there was no server replacement.

    Cost of fixing unit tests: a red light due to a data-quality issue could be an indicator of a badly written unit test, so data-dependent unit tests were rewritten to reduce the likelihood of a red light due to bad data. In many cases, the necessary data was inserted into the test environment so that the unit tests could run accurately. It makes sense to say that by making the data more robust, a test that depends on that data becomes more robust too. Of course, this worked well!

    Cost of coverage, i.e., writing unit tests for already existing code: there was the problem of unit-test coverage. Thousands of methods had no unit tests, so a sizeable number of man-days would have been needed to create them. As this would have been too difficult to justify in a business case, it was decided that unit tests would be required for any new public method going forward; methods without a unit test were termed "potentially infra-red". An interesting point here is that static methods were a moot point: it was unclear how one could uniquely determine how a specific static method had failed.

    Cost of bespoke releases: NAnt scripts only go so far. They are not that useful for, say, CMS-dependent builds for EPiServer or any UI-oriented database deployment. These are the types of issues that occurred on the build server for hourly test runs and overnight QA builds. I contend that these are unnecessary, as a build master can perform those tasks manually at release time, especially with a one-man band and a small build.

    So, single-step builds have not justified the use of CI in my experience. What about more complex, multi-step builds? These can be a pain to set up, especially without a NAnt script, and even after creating one, they were no more successful. The cost of fixing the red-light issues outweighed the benefits. Eventually, developers lost interest and questioned the validity of the red light. Having given it a fair try, I believe that CI is expensive and involves a lot of working around the edges instead of just getting the job done. It is more cost-effective to employ experienced developers who do not make a mess of large projects than to introduce and maintain an alarm system. This holds even if those developers leave: it doesn't matter if a good developer leaves, because the processes he follows ensure that he writes requirement specs and design specs, sticks to the coding guidelines, and comments his code so that it is readable, and all of this is reviewed.
    If that is not happening, then his team leader is not doing his job, which should be picked up by his manager, and so on. For CI to work, it is not enough to just write unit tests, attempt to maintain full coverage, and ensure a working infrastructure for sizeable systems. The bottom line: one might question whether fixing as many bugs as possible before release is even desirable from a business perspective. CI involves a lot of work to capture a handful of bugs that the customer could identify in UAT, or that the company could get paid to fix under a client service agreement once the warranty period expires anyway.
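
    For reference, the kind of setup described (CruiseControl.NET polling source control and running a NAnt build) is configured in ccnet.config roughly as in the sketch below; all names and paths are illustrative, not taken from the post:

        <cruisecontrol>
          <project name="MyProject">
            <sourcecontrol type="vss">
              <!-- VSS (or TFS) connection details go here -->
            </sourcecontrol>
            <tasks>
              <nant>
                <executable>C:\nant\bin\nant.exe</executable>
                <buildFile>build.xml</buildFile>
                <targets>
                  <target>test</target>
                </targets>
              </nant>
            </tasks>
          </project>
        </cruisecontrol>

    Each red light then traces back to a specific project/task pair, which is exactly where the investigation costs described above accumulate.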

    Read the article

  • How can I ensure our project gets developed successfully, without having any project management experience? [migrated]

    - by Raven13
    I'm a web developer who is part of a three-man team that has been tasked with a rather large and complex development project. Other than some direction and impetus from management, we're pretty much on our own to develop the new website. None of us have any project management experience nor do my two coworkers seem like they would be interested in taking on that role, so I feel like it's up to me to implement some kind of structure to the development process in order to avoid issues down the road. What can I do as a developer without project management experience to ensure that our project gets developed successfully and avoid the pitfalls of developing a project without a plan?

    Read the article

  • Does Webmaster Tools list traffic from ads as inbound links?

    - by Mohamad
    In Webmaster Tools, under the inbound links section, do ads get counted as inbound links? I am doing a review of inbound links on a website and found that most of them come from meaningless blogs and spam websites. Before I accuse anyone of not doing their job properly, I would like to know something: is it possible that those inbound links were generated when an ad for the website appeared on the spam website? An SEO firm was paid handsomely to generate inbound links, and I am afraid all they did was submit material to spam blogs and websites.

    Read the article

  • Movie Posters Revised as 8-Bit Masterpieces

    - by Jason Fitzpatrick
    If you like your movie posters to look a little more like Pac-Man and a little less like polished photography, then this roundup of 8-bit movie posters is for you. Star Wars, Office Space, Kill Bill, 300: you'll find all sorts of movie posters envisioned as 8-bit adventures in Eric Palmer's gallery of 8-bit creations. 8-Bit Movie Posters [via Neatorama]

    Read the article

  • Quality Backlinks - A Key to Search Engine Optimization

    Backlinks are the links that point to your blogs, sites, or articles. They are the single most significant factor in determining the page rank of your site or blog, and a proven way to earn a decent place in Google or any of the major search engines. There are many other aspects of SEO, but quality backlinks are the most reliable route to success in search engine optimization. Now it is time to take a look at the components of backlinks that matter most.

    Read the article

  • How do I set PATH variables for all users on a server?

    - by Rob S.
    I just finished installing LaTeX for my company's Ubuntu server that we all SSH into to use. At the end of the install it says this: Add /usr/local/texlive/2010/texmf/doc/man to MANPATH, if not dynamically determined. Add /usr/local/texlive/2010/texmf/doc/info to INFOPATH. Most importantly, add /usr/local/texlive/2010/bin/x86_64-linux to your PATH for current and future sessions. So, my question is simply: How do I do this so that these variables are set for all users on the system? (And yes, I have sudo permissions). Thanks in advance to any and all responses I receive.
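
    A common way to do this for every user (offered as a suggestion; the installer output above doesn't prescribe one) is to drop a small script into /etc/profile.d/, which login shells source system-wide:

        sudo tee /etc/profile.d/texlive.sh <<'EOF'
        export PATH=/usr/local/texlive/2010/bin/x86_64-linux:$PATH
        export MANPATH=/usr/local/texlive/2010/texmf/doc/man:$MANPATH
        export INFOPATH=/usr/local/texlive/2010/texmf/doc/info:$INFOPATH
        EOF

    Users pick up the new variables at their next login (or after sourcing the file manually in an existing SSH session).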

    Read the article

  • Can I use Ubuntu Server to replace our Windows environment?

    - by Aaron English
    I have recently been put in charge of a network overhaul for our company. I have done plenty with Ubuntu in school, but it has been a few years. I would like to replace our current servers with Ubuntu, although I am unsure whether it will work. Our current environment runs a domain, Exchange, and a VPN. I know there are solutions capable of this. I guess my main worry is: will Windows 7 and Windows XP be able to use Ubuntu as a domain controller? If anyone has had success with this, I would love some input. I have a meeting in a couple of months at which I am supposed to explain our plan. Thank you.
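
    On the domain-controller point specifically: Samba 3 can act as an NT4-style domain controller that Windows XP clients can join out of the box, while Windows 7 clients need a couple of documented registry tweaks. A minimal smb.conf sketch of that role (parameter values are illustrative):

        [global]
            workgroup = MYDOMAIN
            security = user
            domain logons = yes
            domain master = yes

    This covers the domain part only; Exchange has no drop-in Ubuntu equivalent, though stacks such as Postfix/Dovecot cover much of the mail functionality.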

    Read the article

  • Blog not even ranking for exact title match, after domain has been dropped twice [on hold]

    - by Akshay Hallur
    Consider a blog related to blogging and SEO. The domain was dropped (expired) twice before acquisition; the current owner is the third owner of the domain and has had it for 5 months. Blog posts are not ranking, even for exact titles: Google+ or other shares show up instead of the content, and some blog posts are not even indexed. Let us say that it gets around 7 organic visits per day. The dropped domain was most likely not used for spam (the Wayback Machine shows 3 captures since 2004, two of them framed drop pages; I don't know whether there was email spam), and there are no manual actions in WMT, so no reconsideration request is possible. What could be the reason for this? How can Google be told that ownership has changed and the domain is now spam-free? Is this domain salvageable, or does this only change after relocating to another domain?

    Read the article

  • Should Site Title be Before or After Page Title?

    - by NickAldwin
    Apologies if this is a dupe. I tried searching, but didn't find anything specifically addressing this concern. When creating a large(ish) site, page titles usually reference both the site name and the current page name. However, it seems there are two main conventions: Bob's Awesome Site - Contact Page and Contact Page - Bob's Awesome Site I've looked around, and pages usually use one of the two variants above. Is there any reason to use one over the other? SEO/readability/usability/etc? I've thought about it, and have only come up with: Page first - Differentiates the tab when the browser is crowded with lots of tabs Site first - Immediately see the "parent" site, so to speak; more cohesive experience

    Read the article

  • Does using structured-data semantic LocalBusiness schema markup work for local EMD URLs?

    - by ElHaix
    Based on what I have read about Google's recent Panda and Penguin updates, I'm getting the impression that using semantic markup may help improve SEO results. On an EMD (exact-match domain) site that may have been hit, we list location-based products. We are now going to add itemtype="http://schema.org/Product" to each product, with the relevant details. However, a given product may be available in Los Angeles and also appear on a Seattle results page. We could add a LocalBusiness item type on each geo page to define that page's location. The definition states: "A particular physical business or branch of an organization. Examples of LocalBusiness include a restaurant, a particular branch of a restaurant chain, a branch of a bank, a medical practice, a club, a bowling alley, etc." We could then use the location property, which would simply include the city/state details. I realize that this markup is meant for a physical location; however, could this be done without seeming black-hat?
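
    As a sketch of the idea (all names are invented for illustration), the page-level markup being considered would look something like this in microdata:

        <div itemscope itemtype="http://schema.org/LocalBusiness">
          <span itemprop="name">Acme Widgets - Los Angeles</span>
          <div itemprop="location" itemscope itemtype="http://schema.org/Place">
            <div itemprop="address" itemscope
                 itemtype="http://schema.org/PostalAddress">
              <span itemprop="addressLocality">Los Angeles</span>,
              <span itemprop="addressRegion">CA</span>
            </div>
          </div>
        </div>

    Whether this reads as black-hat hinges on truthfulness: LocalBusiness asserts a physical presence, so if there is no actual branch in that city, the safer route is to mark up only the Product items and leave the geo pages unmarked.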

    Read the article

  • How to properly remove URLs from Google's index?

    - by ElHaix
    On some of our sites, we now have several thousand pages that dilute our website's keyword density. The website is an MVC site with SEO routing. If I submit a new sitemap with, say, only the 2,000 or so pages that we want indexed, will Google re-index the site with only those 2,000 pages and drop the superfluous ones, even though navigating to the diluting pages still works? For example, I want to keep roughly 2,000 pages like the following: www.mysite.com/some-search-term-1/some-good-keywords www.mysite.com/some-search-term-2/some-more-good-keywords And remove several thousand like the following that have already been indexed: www.mysite.com/some-search-term-xx/some-poor-keywords www.mysite.com/some-search-term-xx/some-poor-more-keywords These pages are not actually "removed", as navigating to these URLs still renders a page. Even though there are potentially hundreds of thousands of pages, I only want about 2,000 to be re-indexed and retained, with the others removed (without having to do this manually). Thanks.
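
    For what it's worth, a sitemap only tells Google which pages exist; it does not signal removal. The usual mechanism (offered as a suggestion, not taken from the post) is to serve a noindex directive on every page you want dropped, which removes each one as it is recrawled:

        <meta name="robots" content="noindex, follow">

    In an MVC site with SEO routing this can be emitted conditionally by the view for the "poor keyword" routes; blocking those URLs in robots.txt instead would backfire, since Google would then never recrawl them to see the noindex.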

    Read the article

  • Dynamic MMap ran out of room when trying to sudo apt-get anything

    - by user1610406
    I was having an error in Update Manager asking me to do a partial upgrade, which fails. Now I can't sudo apt-get install anything. I tried to fix it, and now I can't sudo apt-get anything at all. Every time, I get this output:

        Reading package lists... Error!
        E: Dynamic MMap ran out of room. Please increase the size of APT::Cache-Limit. Current value: 25165824. (man 5 apt.conf)
        E: Error occurred while processing libuptimed0 (NewVersion1)
        E: Problem with MergeList /var/lib/apt/lists/archive.ubuntu.com_ubuntu_dists_lucid_universe_binary-i386_Packages
        W: Unable to munmap
        E: The package lists or status file could not be parsed or opened.

    I have no idea why this is happening or how to fix it, and I fear that trying something that doesn't work will make my problem worse. (For reference, I am currently running 10.04 (Lucid) on my machine.)
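
    The error message itself suggests the first fix: raise APT::Cache-Limit. A hedged recipe (the 99cache-limit file name is arbitrary) that also clears the possibly corrupted list file that the MergeList error complains about:

        echo 'APT::Cache-Limit "100000000";' | sudo tee /etc/apt/apt.conf.d/99cache-limit
        sudo rm /var/lib/apt/lists/archive.ubuntu.com_ubuntu_dists_lucid_universe_binary-i386_Packages
        sudo apt-get update

    If the MergeList error persists, removing all files under /var/lib/apt/lists/ (but not the partial directory) before running the update is the usual next step.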

    Read the article

  • Why Is Another Domain Resolving To My IP Address?

    - by Andrew
    I'm not really sure if this is something that I should worry about... I'm currently renting a dedicated server which is hosting a website I've created. The domain of the website was registered with GoDaddy. After submitting a sitemap to Google several months ago, I've noticed that another domain name is resolving to my IP address. This means that every page on my website is actually accessible from another domain. As far as I can tell, the other domain name is meaningless to me, so I'm not sure if this is something I should worry about or not. Is this a residual DNS record from another site that is probably no longer in use? Is it important from the standpoint of either security or SEO? My website is a .com which will later serve e-commerce purposes. The other domain has a top-level domain of st. It's the first one of those that I've encountered. Many thanks in advance!
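
    This usually just means someone else's DNS record points at an address you now rent; nothing on your side created it. If the duplicate content bothers you (it can matter for SEO), one option, sketched here for Apache with example.com standing in for the real .com domain, is a catch-all default virtual host so that unknown Host headers never reach the site:

        <VirtualHost *:80>
            ServerName catchall.invalid
            Redirect 301 / http://www.example.com/
        </VirtualHost>

    Because Apache uses the first matching vhost as the default, this block must come before the real site's vhost; requests for the stray .st name then 301 to the canonical domain.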

    Read the article

  • Caption Competition 3: Caption With a Vengeance

    - by Simple-Talk Editorial Team
    Please to be informing us what might be going on here. Anything faintly computer-themed will always help, but being funny is more important. The one that raises the most chuckles from our team of professional misery-guts will win a $50 Amazon voucher. Get entries in before 5 p.m. UK time on the 30th of May to be eligible.  As ever, some suggestions to get you started: He didn’t know how developers kept getting into the server room, but by jove they wouldn’t get out again. Every time you build straight to production, it’s ten minutes with the bees. I know management’s resistant to the cloud, but was burying the IT department this far underground really necessary? After weeks of hunting, a group of highly trained Azure specialists capture the man responsible for branding.

    Read the article

  • How Often do You Change E-mail Addresses? [Poll]

    - by Asian Angel
    Recently we ran across an article about a man who consistently changes his e-mail address every 20 months. Why? To throw off spam. With that in mind, we became curious and decided to ask how often you change your e-mail addresses. Everyone has their own method for dealing with the bane known as spam, whether it is heavy filtering, separate accounts to catch possible spam activity, abandoning swamped accounts, etc. Here is your opportunity to share how you deal with spam and protect your accounts, and to voice your thoughts on changing to new accounts on a regular schedule, as described in the article linked below. How Frequently Do You Change Your Email Address? [Apartment Therapy]

    Read the article

  • Possible problems in a team of programmers [on hold]

    - by John
    I am a "one man team" ASP.NET C#, SQL, HTML, JQuery programmer that wants to split workload with two other guys. Since I never actually thought of possible issue in a team of programmer, there are actually quite a few that came to my mind. delegating tasks (who works on what which is also very much related to security). I found Team Foundation Service could be helpful with this problem and started reading about it. Are there any alternatives? security (do now want for original code to be reused outside the project) How to prevent programmers from having access to all parts of code, and how to prevent them from using that code outside of project? Is trust or contract the only way?

    Read the article

  • How to create a reasonably sized urban area manually but efficiently

    - by Overv
    I have a game concept that only really works in an urban area that is of reasonable scale and diversity. In terms of what it should look like, think GTA, in terms of the size think more like a small neighbourhood with residents and a few local shops, perhaps a supermarket. I'm mostly experienced in programming and not at all with modelling, texturing or drawing, but I've found that SketchUp allows me to design interesting looking buildings that I model after real world buildings in my own neighbourhood. Designing these buildings and other objects can take from a few tens of minutes to a few hours. My question is: what is the best approach for a one man army like me who does manage to model buildings to create an interesting city environment in a reasonable amount of time? My game will not be based on procedural generation, the environment will actually be modelled like GTA cities.

    Read the article

  • Time out while mounting samba share

    - by nullDev
    I am trying to mount a hard disk connected to my WDTV Live box. The following command: smbclient -L 192.168.1.2 -U guest gives the following output:

        Domain=[WORKGROUP] OS=[Unix] Server=[Samba 3.5.1]

            Sharename        Type  Comment
            ---------        ----  -------
            Expansion_Drive  Disk  Expansion_Drive
            MICROVAULT       Disk  MICROVAULT
            IPC$             IPC   IPC Service (WDTV LIVE)

        Domain=[WORKGROUP] OS=[Unix] Server=[Samba 3.5.1]

            Server           Comment
            ---------        -------
            WDTVLIVE         WDTV LIVE

            Workgroup        Master
            ---------        -------
            WORKGROUP

    But if I try sudo smbmount //WDTVLIVE/Expansion_Drive /home/ashish/wdtvlive/ -o guest,rw I get the following: Warning: mapping 'guest' to 'guest,sec=none' mount error(110): Connection timed out Refer to the mount.cifs(8) manual page (e.g. man mount.cifs) I am able to browse and mount through Nautilus as well, but I don't want the drive to be mounted under gvfs.
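
    One hedged thing to try (the options here are guesses at what the WDTV's Samba 3.5 server expects, not confirmed): the warning shows guest being mapped to sec=none, which some servers reject; forcing NTLM and mounting by IP may get past the timeout:

        sudo mount -t cifs //192.168.1.2/Expansion_Drive /home/ashish/wdtvlive \
            -o guest,sec=ntlm,uid=1000

    Mounting by IP also rules out any NetBIOS name-resolution delay for WDTVLIVE.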

    Read the article

  • How many lines of code can a C# developer produce per month?

    - by lox
    An executive at my workplace asked me and my group of developers the question: how many lines of code can a C# developer produce per month? An old system was to be ported to C#, and he wanted this measure as part of the project planning. From some (apparently credible) source he had the answer of "10 SLOC/month", but he was not happy with that. The group agreed that this was nearly impossible to specify, because it would depend on a long list of circumstances. But we could tell that the man would not leave (or would be very disappointed in us) if we did not come up with an answer that suited him better. So he left with the many-times-better answer of "10 SLOC/day". Can this question be answered, offhand or even with some analysis?

    Read the article

  • Terminator Skull Crafted from Dollar Store Parts [Video]

    - by Jason Fitzpatrick
    Earlier this year we shared an Iron Man prop build made from Dollar Store parts. The same Dollar Store tinkerer is at it again, this time building a Terminator endoskull. James Bruton has a sort of mad-tinker knack for finding odds and ends at the Dollar Store and mashing them together into novel creations. In the video below, he shows how he took a pile of random junk from the store (plastic bowls, cheap computer speakers, even the packaging the junk came in) and turned it into a surprisingly polished Terminator skull. Hit up the link below for the build in photo-tutorial format. Dollar Store Terminator Endoskull Build [via Make]

    Read the article

< Previous Page | 176 177 178 179 180 181 182 183 184 185 186 187  | Next Page >