Search Results

Search found 13195 results on 528 pages for 'technical trainer pro'.


  • Web host recommendation [closed]

    - by birdus
    Possible Duplicate: How to find web hosting that meets my requirements? I'm researching a web host for a client and am looking for recommendations of hosts you may have used and been happy with. Here are the requirements I've been given. The hosting service needs to either provide or allow us to add the following functionality: i. ASP/ASP.NET; ii. video streaming; iii. audio streaming; iv. reporting; v. RSS feeds; vi. site search; vii. forums; viii. podcasts; ix. Flash; x. a CMS (we're looking at using Percussion Software); xi. PII registration; xii. a tie-in to SF.com (Salesforce). They also want a pre-production server available so they can test the website before going public with it. This may just be a matter of paying extra for another site/server. Thanks for the help.

    Read the article

  • What are the recommended minimum payment options on an e-commerce site?

    - by Mantorok
    I've recently released a site that presently integrates with PayPal for taking payments. This doesn't require buyers to hold a PayPal account, as they can submit credit cards through the PayPal checkout without having to sign up. But what other options would you say are recommended, or perhaps even required, to ensure you capture as many potential customers as possible? EDIT: We accept payments worldwide, by the way.

    Read the article

  • How to determine if someone is accessing our database remotely?

    - by Vednor
    I own a content publishing website developed using CakePHP(tm) v2.1.2 and MySQL 5.1.63. It was developed by a freelance developer who kept remote access to the database, which I wasn't aware of. One day he accessed the site and overwrote all the data. After the attack, my hosting provider disabled remote access to our database and changed the password. But somehow he accessed the site database again and overwrote some information. We managed to stop the second attack by taking the site down immediately, but now we suspect that he'll attack again. What we could identify is that he runs a query and changes every piece of information in the database in a matter of seconds. Is there any possible way to detect how he's accessing our database without remote access or knowledge of our cPanel password? Or to identify whether he has left something inside the site that grants him access to our database?
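
    A few MySQL-level checks can help narrow this down. This is a rough sketch, assuming you can run queries as a privileged user (e.g. via phpMyAdmin or the cPanel SQL console); the general_log settings in particular are an assumption about what your host permits:

        -- See every open connection right now, including the host it comes from.
        SHOW FULL PROCESSLIST;

        -- List which accounts may connect and from which hosts ('%' means from anywhere).
        SELECT user, host FROM mysql.user;

        -- Temporarily log every statement (heavy on busy servers) to capture what he runs.
        SET GLOBAL general_log = 'ON';
        SET GLOBAL log_output = 'TABLE';
        SELECT event_time, user_host, argument
          FROM mysql.general_log ORDER BY event_time DESC LIMIT 50;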

    Read the article

  • Identical spam coming from many different (but similar) IP addresses

    - by DisgruntledGoat
    A forum I run has been the victim of spam user accounts recently - several accounts have been registered and their profiles filled with advertising/links, all for the same company or group of companies. I deleted several accounts weeks ago and blocked some IP addresses, but today they have come back with the same spam. Every account has a different IP address, but they are all of the form 122.179.*.* or 122.169.*.*. I am considering blocking those two IP ranges, but there are potentially thousands of IPs in that range. They appear to be assigned to India (although the spam is for an American company), so given that the site is for a western, English-speaking audience, maybe it doesn't matter. My questions: How are they posting from so many IPs? Is there likely to be a limit to the number of IPs they have access to? Is there anything else I can do at the IP level to block them? (I am looking into other measures like blocking usernames/links.)
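
    If the forum runs on Apache, the two /16 ranges can be blocked in one place rather than address by address. A minimal sketch, assuming Apache 2.2-style access control in .htaccess or the vhost (on Apache 2.4 the equivalent is Require not ip):

        # Block the two ranges the spam accounts register from.
        Order Allow,Deny
        Allow from all
        Deny from 122.179.0.0/16
        Deny from 122.169.0.0/16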

    Read the article

  • SEO: Getting site to show in location-specific searches

    - by willvv
    I'm really new to this SEO world and I've been reading a lot to try and figure it out. We have a site, moodbond.com, that allows users to browse/create events anywhere, and we fill it with content from the main cities in the US. We would like it to show up for searches like "events in san francisco" or "what to do in new york"; however, since the site is not really location-specific, I'm not sure where to begin. I've been considering a few things; maybe you can help me decide if these would be a good way to start or if I should try something different. 1) Allow location-specific URLs (e.g. moodbond.com/browse/san-francisco) that just show the main page centered on San Francisco. 2) Change the headers/title of the page so they adapt automatically to the city being browsed (and change this dynamically as the user changes the location of the map). 3) Add internal links to different locations (e.g. add a link in the footer of the page that says "Events in Seattle" and makes the site load events in that city; this would probably depend on implementing #1). What do you think? Will any of these really help, or should I look for a different approach? Any advice is welcome. Thanks

    Read the article

  • Could crosslinking using very general anchor texts be a reason for a drop in rankings?

    - by webmasters
    I have crosslinked 20 sites and thought I had been penalized for it. I asked this question, and some experienced members told me that the crosslinking may not necessarily be the reason. The sites are on the same host, on different C-class IPs, and every site is linked to every other. Each site targets long-tail keywords. Site 1 - BMW Used Cars - and my area. Site 2 - WW Used Cars - and my area. And so on. When I crosslinked them (in the sidebar), I did it for the users; instead of repeating the terms "used cars" and my location over and over (since my users are targeted), I just crosslinked them using the brand: BMW, WW. Targeting locally, my niches are not overly competitive, so I did not need too many external links to rank in various positions on the 1st page. I'm thinking that when I chose to link using only the brand, Google might have thought I wanted to actually rank for BMW and WW, hence the drop in my targeted local traffic. Could this be? I have now no-followed the links and am noticing a slight recovery, but if it's not an interlinking penalty it would be a shame not to benefit from my links.

    Read the article

  • Getting IP/WHOIS information from users clicking ads in Google AdWords?

    - by nctrnl
    I am running a Google AdWords campaign and am wondering if I can get any information from the people who click my ads? I already collect statistics on the website, but I get no referer information from the Google ads in the search network, only from the display network. I am also running Google Analytics on the website but I can't find any relevant information there. P.S: Tag suggestion: google-adwords

    Read the article

  • SEO, IIS 7 and web.config in subfolder issue

    - by tesicg
    We have an ASP.NET application that has a sub-folder with .aspx pages and a separate web.config file in it. The .aspx pages in that sub-folder behave as a separate site. In the web.config file at the application level, I set a rule that removes trailing slashes:

        <rewrite>
          <rules>
            <rule name="RemoveTrailingSlashRule1" stopProcessing="true">
              <match url="(.*)/$" />
              <conditions>
                <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
              </conditions>
              <action type="Redirect" redirectType="Permanent" url="{R:1}" />
            </rule>
          </rules>
        </rewrite>

    I expect this rule to propagate down to the sub-folder as well. To access the site in the sub-folder we type http://concert.local/elki/ and should get it without the trailing slash, as http://concert.local/elki. But the trailing slash remains. The web.config file in the sub-folder looks like this:

        <configuration>
          <system.webServer>
            <defaultDocument>
              <files>
                <add value="Sections.aspx" />
              </files>
            </defaultDocument>
          </system.webServer>
        </configuration>

    Read the article

  • URL slugs: ideal length, and the real SEO effects of these slugs

    - by tattvamasi
    This question is addressed widely on SO and outside it, but for some reason, instead of taking it as a good load of great advice, all this information is confusing me. ** Problem ** I already had, on one of my sites, "prettified" URLs. I had taken out the query strings and rewritten the URLs, and the links were short enough for me, but there was a problem: the ID of the item or post in the URL isn't good for users. One user asked if there's a way to get rid of the numbers, and I thought it would be better for users to just see a clue to the page content in the URL. ** Solution ** With this in mind, I am trying it on a section of the site. Armed with 301 redirects, some parsing work, and a lot of patience, I have added URL slugs to some blog entries, and the slug of the URL reports the title of the article (something close to http://example.com/my-news/terribly-boring-and-long-url-that-replaces-the-number-I-liked-so-much/). ** Problems after Solution ** The problem, as I see it, is that the URLs of those blog articles are now very descriptive for sure, but they are also impossible to remember. So this brings me back to the same issue I had before: if numbers say nothing and can't be remembered, what's the use of these slugs? I prefer to see http://example.com/my-news/1/ rather than http://example.com/my-news/terribly-boring-and-long-url-that-replaces-the-number-I-liked-so-much/. To avoid forcing my users to memorize my URLs, I have added a script that finds the closest match to the URL you type and redirects there. This is something I like, because the page now acts as a sort of little search engine, and users can play with the URLs to find articles. ** Open questions ** I still have some open questions, and don't seem to be able to find answers, because the answers tend to contradict one another. 1) How many characters should a URL ideally be? I've read the magic number 115 and am sticking to that, but am not sure. 2) Is this really good for SEO? One of those blog articles I have redirected, with the ID number in the URL and all, ranked second on Google. I've just found this question, and the answer seems to be consistent with what I think (URL slug and SEO - structure), but see this other question with the opposite opinion. 3) To ask the question with a specific example, would this URL risk being penalized? Is it acceptable? Is it too long? Stack Overflow seems to have comparably long URLs, but I'm not sure it's a winning strategy in my case. I just want to help my users without running afoul of Google's algorithms.
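
    On the length question, one practical compromise is to keep the descriptive slug but cap it at a fixed length, cutting at a word boundary. A small PHP sketch of the idea (the 80-character cap is just an example, not a recommendation):

        <?php
        // Turn a title into a URL slug, capped at $maxLength characters at a word boundary.
        function makeSlug($title, $maxLength = 80)
        {
            $slug = strtolower(trim($title));
            $slug = preg_replace('/[^a-z0-9]+/', '-', $slug); // non-alphanumerics become hyphens
            $slug = trim($slug, '-');

            if (strlen($slug) > $maxLength) {
                $slug = substr($slug, 0, $maxLength);
                $cut  = strrpos($slug, '-');                  // avoid ending mid-word
                if ($cut !== false) {
                    $slug = substr($slug, 0, $cut);
                }
            }
            return $slug;
        }

        echo makeSlug('Terribly boring and long URL that replaces the number I liked so much');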

    Read the article

  • Public/Private Key Generation

    - by JacKeown
    I'm just learning about public key cryptography and I want to make a public key certificate for my web server so that I can use HTTPS. My server is hosted on some random free web host where it's practically impossible to do anything on the server itself, and so my question is this: is there any harm in creating my private key, public key, and public key certificate on my own computer using OpenSSL and then transferring them to the server? Thanks in advance. Also, if there's anything else I'm missing, any help would be appreciated.
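
    Generating the key pair and certificate locally and uploading the results is common practice; the private key is the only part that needs to be kept secret in transit. A rough OpenSSL sketch (the file names and the self-signed step are just examples - for a browser-trusted certificate you would send the CSR to a certificate authority instead):

        # Generate a 2048-bit RSA private key.
        openssl genrsa -out example.com.key 2048

        # Create a certificate signing request (public key + identity) from that key.
        openssl req -new -key example.com.key -out example.com.csr

        # Either send example.com.csr to a CA, or self-sign for testing:
        openssl x509 -req -days 365 -in example.com.csr -signkey example.com.key -out example.com.crt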

    Read the article

  • How to dynamically add Google Analytics tracking code using PHP?

    - by foodil2
    I would like to add a tracking code for each link in my email content. I have registered a Google Analytics account and found that there is only one tracking code. So, given a Google Analytics ID and password, how can I use PHP to register a new tracking code, add each code to a link (do I need to use PHP to add a 1px x 1px image for each link?), and return the codes added? Thank you. Besides that, do I have to track the results in Google Analytics (Traffic Sources - Campaigns), or is there an API that can integrate the Google Analytics results panel into my system? Thank you again for any help.
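
    Google Analytics does not issue a separate tracking code per link; the usual approach for email is to tag every link with campaign (utm_) parameters so the clicks show up under Traffic Sources - Campaigns for the one existing property. A minimal PHP sketch (the campaign names and the addUtmTags() helper are illustrative, not part of any Google API):

        <?php
        // Append Google Analytics campaign (utm_) parameters to a URL so clicks
        // from the email are attributed to this campaign in the reports.
        function addUtmTags($url, $source, $medium, $campaign, $content = '')
        {
            $params = array(
                'utm_source'   => $source,    // e.g. "newsletter"
                'utm_medium'   => $medium,    // e.g. "email"
                'utm_campaign' => $campaign,  // e.g. "june-promo"
                'utm_content'  => $content,   // optional: identifies the individual link
            );
            $separator = (strpos($url, '?') === false) ? '?' : '&';
            return $url . $separator . http_build_query(array_filter($params));
        }

        // Tag each link in the email differently via utm_content.
        echo addUtmTags('http://www.example.com/offer', 'newsletter', 'email', 'june-promo', 'header-link');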

    Read the article

  • Website migration is not working for all computers

    - by Shadowizoo
    We have 2 servers on the same network, Server-A and Server-B. On Server-A (Windows Server 2003) we have IIS, and our website was hosted on it until a few months ago (about 7-8 months). We bought a new server, Server-B (Windows Server 2008) with IIS 7.5, and copied our old website onto this new machine. On our router, we forward port 80 to Server-B. Server-A is still on because we need to access some old data through our old website; I would like to access it via its internal IP (192.168.1.xxx/mywebsite). On my Windows 7 computer, if I type www.example.com or example.com (without www.), I'm redirected to Server-B and I can see our new interface. On some Windows Vista computers, example.com (without www.) redirects to Server-B, but if I type www.example.com, I'm still on Server-A. In our website code (on Server-B), we sometimes redirect with a "www.", and this causes errors because we end up trying to access a webpage that exists on Server-B but not on Server-A, since www.example.com is still reaching Server-A on those machines. I compared 2 computers with Vista Home on them and the Internet Options look the same. I cannot figure out why this is happening.

    Read the article

  • Google indexed the same page under two URLs (despite rel-canonical)

    - by unor
    The Super User question "Playing mp3 in quodlibet displays “GStreamer output pipeline could not be initialized” error" is indexed under two URLs in Google: http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia/652058 The first one is the canonical one; the corresponding rel-canonical is included in both pages: <link rel="canonical" href="http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia" /> Google also indexed http://superuser.com/a/652058, which redirects to the answer: http://superuser.com/questions/651591/playing-mp3-in-quodlibet-displays-gstreamer-output-pipeline-could-not-be-initia/652058#652058 Now, the second URL from above is the same as this one minus the fragment #652058. So Google seems to strip the fragment, which results in exactly the same page under another URL (= containing the answer ID /652058 as suffix), and indexes it, too -- despite rel-canonical and duplicate content. Shouldn’t Google recognize this and only index the canonical variant? And what could be the reason why Stack Exchange includes the answer ID in the URL path, and not only in the fragment (resulting in various URL variants for the same page)?

    Read the article

  • Domain mapping issues

    - by Nadya
    I have two domain names - .com & .co.uk - bought with 123-reg, and just one student Windows hosting pack, associated with the .co.uk domain. The .com domain is the main one people will be trying to access, so I just mapped that domain to the hosting this morning. The problem is that I would really like it to be functional by tomorrow morning, and the usual waiting time is 24-48 hours. Is there any point in stopping the process and trying to forward it with a CNAME record instead - does that take less time? (I can just go back and do proper domain mapping over the weekend.) Also, is there a way to check whether the domain mapping has been done correctly before the 24-48 hours are up? From some computers I get a 404 error on the homepage.

    Read the article

  • Can I use a 302 redirect to serve up static content from a URL with _escaped_fragment_?

    - by Starfs
    We would like to serve up SEO-friendly Ajax-driven content. We are following this documentation. Has anyone ever tried writing a 302 redirect into the .htaccess file that takes the ?_escaped_fragment_= string and sends it to a static page, for example /snapshot/yourfilename/? How will Google react to this? I've gone through the documentation and it's not very clear. The quote below is from Google's documentation and is what I found; I'm not sure if they are saying that you can redirect the _escaped_fragment_ URL to a different static page, or if this is about redirecting the hashbang URL to static content. Thoughts? From Google's site: Question: Can I use redirects to point the crawler at my static content? Redirects are okay to use, as long as they eventually get you to a page that's equivalent to what the user would see on the #! version of the page. This may be more convenient for some webmasters than serving up the content directly. If you choose this approach, please keep the following in mind: Compared to serving the content directly, using redirects will result in extra traffic because the crawler has to follow redirects to get the content. This will result in a somewhat higher number of fetches/second in crawl activity. Note that if you use a permanent (301) redirect, the url shown in our search results will typically be the target of the redirect, whereas if a temporary (302) redirect is used, we'll typically show the #! url in search results. Depending on how your site is set up, showing #! may produce a better user experience, because the user will be taken straight into the AJAX experience from the Google search results page. Clicking on a static page will take them to the static content, and they may experience avoidable extra page load time if the site later wants to switch them to the AJAX experience.
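
    For reference, a minimal .htaccess sketch of the kind of rule the question describes - assuming Apache with mod_rewrite enabled, and assuming /snapshot/<name>/ already serves the pre-rendered HTML (the /snapshot/ path is just the example from the question):

        RewriteEngine On
        # When the crawler requests ?_escaped_fragment_=<name>, send a 302 to the static snapshot.
        RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
        RewriteRule ^ /snapshot/%1/? [R=302,L]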

    Read the article

  • How to configure Google sitemap links? [duplicate]

    - by Alexander Farber
    This question already has an answer here: What are the most important things I need to do to encourage Google Sitelinks? I run a WordPress 3.7.1-de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are 2 links displayed underneath it. I believe these links are called Google Sitelinks, and my question is how to configure them in WordPress. While the right link points to the /ueber-mich URL on the website, the left link was pointing to a non-existent /imprint, and I had to add that webpage as a workaround for now. I'd like to change /imprint to the German /impressum anyway (currently I use mod_rewrite to redirect).

    Read the article

  • Formatting Google Search Result [closed]

    - by user5775
    Possible Duplicate: What are the most important things I need to do to encourage Google Sitelinks? Hello, I am new to search engine optimization. I am working on customizing how my results appear in Google as well as possible. I have learned about the meta tags to customize the text summary. However, my website has some hierarchical parts. When a result appears for a "tip-of-the-iceberg" page, I would like to show links to the "child" pages. For instance, if you Google "Walmart" you will see the following links listed with the result: Electronics, TV & Video, Departments, Furniture, Toys, Girls, Living Room, Computers. Is there any way that I can help Google determine which links to show and the text to display for these child links on my site? Or is this something that Google generates automatically? Thanks!

    Read the article

  • Making a language switch main menu button in Drupal

    - by Let_Me_Be
    I have a bilingual site in Drupal. The problem is that I hate the language switch block taking up so much space (sometimes the only thing in the sidebar is the language switch block). So what I would love to have is a language switch menu item that points to the other language (the one other than the current language). Something like this: | Home | Projects | BlaBla | Cesky | and after the switch: | Domu | Projekty | Blabla | English |. Is that possible without writing a whole new module?

    Read the article

  • In Google Analytics, how can I determine the value of a page if no goals or revenue have been determined?

    - by Brandon Durham
    I have 4 years of data in Analytics with over 20 million pageviews for the entire site. No goals have ever been set up, and while the site is an ecommerce site, no ecommerce features in Google Analytics have ever been taken advantage of. So I have no way to determine what the actual value of a page is. I've been tasked with determining if a particular page on the site is worth keeping around. How might I use all standard data (pageviews, bounce rate, time on page, time on site, etc.) to help determine the value of this page? I really appreciate any help I can get!

    Read the article

  • Should I use subdomains or subfolders for my user groups?

    - by bilygates
    Hello, I run a photography website where each user has their own subdomain (i.e. user.site.com). I'm thinking of adding user groups but I'm unable to decide whether I should also give each group a separate subdomain or simply a subfolder: Subfolders (www.site.com/groups/my-group) - Pros: easier to maintain from a technical point of view. Cons: harder to memorize; the URLs can get really long (www.site.com/groups/my-group/albums/my-album/). Subdomains (my-group.site.com) - Pros: easier to memorize; shorter URLs; one might get the impression that such a URL is somewhat more "independent" from the main site. Cons: group and user names belong to the same namespace, so we need to check for collisions when creating a new user/group; one cannot determine the content of a page by only reading the URL (is x.site.com a user page or a group page?). What's your opinion on the matter? I should note that DeviantArt.com uses the 2nd option (that's where I got the idea). Thank you in advance!

    Read the article

  • Disqus 2012 comments NOT being indexed by Google

    - by Buckers
    We run a high-traffic website at http://www.onedirection.net and we've been using Disqus throughout this year, initially to great effect. We accepted the upgrade to Disqus 2012 back in June, loving the improved user experience and the better community feel - albeit back inside an iframe again. We were specifically told that the comments are now indexed by Google, which was great, and the dynamic nature of the iframe suited our site (all our pages are cached, so by using Disqus the comments are updated straight away). However, it seems that the Disqus 2012 comments are not being indexed, and we've noticed an obvious fall in traffic over the last few months. Initially we didn't put this down to Disqus and focused on other issues (Google algorithm updates etc.). But we're quickly coming around to the view that our pages now contain less indexable text, and we are getting less traffic because of it. We've tried emailing Disqus directly, but they're very slow and don't seem keen to help. Any thoughts on this?

    Read the article

  • Which Hosting for a Mobile Infotainment Portal? [closed]

    - by VenomVipes
    Possible Duplicate: How to find web hosting that meets my requirements? I am building a portal for music and movie downloads. My target audience is 85% mobile users. I expect (according to our promotion plan) 1,000 visitors a day and at least 300 parallel downloads. Files are in mp3 & 3gp format. Please suggest the type of hosting: cloud, VPS, or dedicated. Please also suggest a configuration (suppose I have 50 GB of video & 30 GB of audio). We have a budget of around $70/month for the server. Also, should I keep all files on my host server or just put in links for direct download from other sites offering the same video? My goal is fast loading in mobile browsers (2G GPRS), as well as fast downloads of the files.

    Read the article

  • Create subdomains via C-Panel or via domain registrar?

    - by cybergeek654
    I am a novice, so excuse me if this sounds dumb. I read a few similar questions on this topic here, but they did not answer my question. I have a personal website hosted with a cPanel control panel. Via cPanel I can easily create new subdomains and I do not need to apply any DNS settings; the subdomain is good to go immediately and works fine. I have an unlimited subdomains option. When I check the DNS management control panel on my domain registrar's site, there is no record associated with my subdomains. Now I want to buy a new domain name from 1and1 and have it as an add-on domain on my previous host. 1and1 say they only allow 5 subdomains. What does this mean? Can I not create unlimited subdomains under my new domain name, just as I do for my old domain? How does cPanel create and manage subdomains when there is nothing about them in my DNS control panel? Thanks for your help

    Read the article

  • Getting IIS to parse PHP in .dll files

    - by Agony
    The .dll files aren't actual dynamic link libraries; that name is simply what the client-side software requests (and cannot be changed). Each is essentially a PHP script that should run and return specific values. Currently, however, IIS simply offers the file for download, which results in a failure. This is what the script returns on an Apache server:

        [Update]
        NewVersion=1
        UpdateFileNumber=1
        UpdateFile1=update1/LPServerInfo.dat
        ServerNumber=1
        Server1=http://88.159.116.217/

    Here it is on IIS: 198.24.133.74:8080/update.dll?0 - renaming it to .php works fine for testing; it runs and returns values. I edited the MIME types and set .dll to application/x-httpd-php, but that doesn't seem to work in IIS. Any solutions?
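
    On IIS, MIME types only control how static files are served; what sends an extension to the PHP engine is a handler mapping. A sketch of the kind of web.config entry involved, assuming PHP is already installed under IIS via FastCGI (the C:\PHP\php-cgi.exe path is an assumption - use whatever path the existing *.php handler mapping points to):

        <configuration>
          <system.webServer>
            <handlers>
              <!-- Hand *.dll requests to the PHP FastCGI processor instead of serving them as files. -->
              <add name="PHP-as-dll" path="*.dll" verb="GET,HEAD,POST"
                   modules="FastCgiModule" scriptProcessor="C:\PHP\php-cgi.exe"
                   resourceType="Either" requireAccess="Script" />
            </handlers>
          </system.webServer>
        </configuration>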

    Read the article

  • SEO penalty for landing page redirects

    - by therealsix
    Using eBay as an example, let's say I have a large number of items whose URLs look like this: cgi.ebay.com/ebaymotors/1981-VW-Vanagon-manual-seats-seven-/250953153841. I want to give my clients the ability to put links to these items on their websites EASILY, without knowing or checking my URL. So I created a redirect service that maps their identifier to my URL: ebay.com/fake_redirect_service/shared_identifier9918 would redirect to the link above. This works great: my clients can easily set up these links with information they already have, and the user sees the page as usual. So on to the problem... I'm concerned that this redirect service will have a negative impact on my SEO ranking. Having a landing page redirect you immediately to a different URL seems like something a typical spam site would do. Will this hurt me? Are there any better solutions?
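
    For what it's worth, how the hop is treated depends largely on the status code: a permanent 301 generally passes link signals on to the target URL and keeps the redirecting URL out of the index, whereas a 302 tends to keep the original URL indexed. A minimal PHP sketch of such a redirect endpoint (the lookup array and the id parameter name are purely illustrative):

        <?php
        // fake_redirect_service: map a client-facing identifier to the real item URL.
        $map = array(
            'shared_identifier9918' => 'http://cgi.ebay.com/ebaymotors/1981-VW-Vanagon-manual-seats-seven-/250953153841',
        );

        $id = isset($_GET['id']) ? $_GET['id'] : '';

        if (isset($map[$id])) {
            // 301 so search engines credit the target; use 302 only for genuinely temporary mappings.
            header('Location: ' . $map[$id], true, 301);
        } else {
            header('HTTP/1.1 404 Not Found');
        }
        exit;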

    Read the article
