Search Results

Search found 28303 results on 1133 pages for 'multi site'.

Page 40/1133 | < Previous Page | 36 37 38 39 40 41 42 43 44 45 46 47  | Next Page >

  • How to push changes from Test server to Live server?

    - by anonymous
    As a beginner, I finally noticed the problem with making changes directly on the live server I've been working on: now that I have a couple of users on it, I can't keep bringing it down so often. I created an EC2 image of my live server and set up a separate instance, so I now have two EC2 instances, Stage and Production. I set up GitHub, push changes to the stage branch and test my code there, and when it's all done and working, I push it to the production branch, and everything is good. There is one small wrinkle: I name my config files config_stage.js and config_production.js, add them to .gitignore on each server, and have my code read an ENV flag to load the appropriate config. Is this the correct approach? My main question is: how do you keep track of non-code changes to the server? For example, I installed HAProxy, Stunnel, Redis, MongoDB and several other things on the Stage server for testing, and now that it's all working, how do I deploy them to production? Right now I'm just keeping a list of everything I installed and copying configuration files over by hand, which is very tedious, and I'm afraid I may have missed a step somewhere. Is there a better way to port these changes from my test server to my live server?
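
    A minimal sketch of the ENV-flag approach the question describes, assuming a Node.js app and a NODE_ENV environment variable (the variable name is an assumption; the file names follow the asker's config_stage.js / config_production.js convention):

        // config.js -- pick the per-environment config at require time.
        // NODE_ENV is assumed to be set to "production" on the live box; anything
        // else (or nothing) falls back to the stage config.
        const env = process.env.NODE_ENV === 'production' ? 'production' : 'stage';
        module.exports = require('./config_' + env + '.js');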

    Read the article

  • How to add a DNS TXT record in cPanel, and what to name it?

    - by Lars Holdgaard
    I have a domain for which I need to add a DNS TXT record. More specifically, I have to do the following: "You should now create a DNS text record with the meta tag value shown below for the domain you're securing." The value I should insert is this one: globalsign-domain-verification=list_of_random_chars How do I add this in cPanel? I tried a couple of ways of adding it (screenshots omitted), but the form requires a name and I'm not sure what to enter there. So my question really is: how do I add this TXT record correctly, and what should its name be?
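
    Once the record has been saved, one way to confirm it is visible is a TXT lookup; a sketch using Node's built-in dns module, with example.com standing in for the actual domain:

        // check_txt.js -- look up the domain's TXT records and check for the
        // GlobalSign verification value. 'example.com' is a placeholder.
        const { resolveTxt } = require('dns').promises;

        resolveTxt('example.com').then((records) => {
          const values = records.map((chunks) => chunks.join(''));
          const found = values.some((v) => v.startsWith('globalsign-domain-verification='));
          console.log(found ? 'Verification record found.' : 'Record not visible yet.');
        });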

    Read the article

  • Block a website on HTTPS [closed]

    - by momo1729
    I would like to block some websites on their HTTPS version while allowing them over HTTP. The main websites involved are YouTube and Google Images/Videos. On the HTTP version I can enforce the SafeSearch filter on those platforms, whereas on the HTTPS version I cannot. For me this is a serious issue that undermines the SafeSearch features Google offers. Is there any software or configuration that can do this? P.S.: I'm not sure this is the right place to post this question; maybe you could redirect me to another SE site?

    Read the article

  • 600 visitors per day, 20 backlinks, but still not indexed by Google

    - by Tristan
    Hello, I launched a website on Wednesday (9 August). In 4 days it already has 12,000 page views, 1,400 visits, 20 to 25 backlinks, and a sitemap.xml with 130 URLs, in English and French (URLs like "/en/" and "/fr/"). BUT I'm still not indexed by Google. In Google Webmaster Tools I have 0 backlinks and 130 URLs in the sitemap, but 0 URLs indexed by Google. For a smaller website with fewer visits, I remember it took me less time to appear on Google. My URL is http://www.seek-team.com/en/ for English; just replace /en/ with /fr/ to access it in French. What's causing this? Is there an explanation? Thanks for your help. (P.S. I've already checked robots.txt.)
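
    A quick sanity check for the usual indexing blockers (robots.txt rules, a noindex meta tag, an X-Robots-Tag header); a sketch assuming Node 18+ for the built-in fetch, using the URL from the question:

        // index_check.js -- print robots.txt and look for noindex signals on the page.
        const url = 'http://www.seek-team.com/en/';

        (async () => {
          const robots = await fetch(new URL('/robots.txt', url)).then((r) => r.text());
          console.log('robots.txt:\n' + robots);

          const res = await fetch(url);
          console.log('X-Robots-Tag header:', res.headers.get('x-robots-tag') || '(none)');
          const html = await res.text();
          const noindex = /<meta[^>]+name=["']robots["'][^>]+noindex/i.test(html);
          console.log('noindex meta tag present:', noindex);
        })();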

    Read the article

  • Why does Alexa have two rankings for my website?

    - by MIH1406
    For the following website, Noaoomah Q&A, I have two different Alexa rankings: 1) The public ranking on the Alexa siteinfo page for the website, which is the usual ranking page; it shows a rank of 318,254 and is claimed to be updated daily: http://www.alexa.com/siteinfo/noaoomah.com 2) Another public, daily ranking of the same website, viewable either in the freely available Top 1,000,000 Sites list (linked near the top right of that page) or on the StatsCrop website; this one shows a rank of 253,753. Which one is more accurate, and why do the two daily rankings differ?

    Read the article

  • index.html in subdirectory will not open

    - by Dušan
    I want to have a website structure like this:
    example.com/ - homepage
    example.com/solutions/ - the solutions parent page
    example.com/solutions/solution-one - child solution page
    example.com/solutions/solution-two - child solution page
    I have set up the example.com/solutions/index.html file so it can be opened as the parent page, but it shows me an error: "You don't have permission to access /solutions/.html on this server." What is causing this, and how can I open the parent directory page? I am just using regular HTML pages, no CMS or anything...

    Read the article

  • How to remove HTML code from search result page content

    - by Jack Torris
    I have a music website. There are 46 album pages, and each page has a different player and files. I entered one of the album URLs in a search engine and found that Google is displaying player code in the search result snippet. For example, enter the URL in Google and check the results: each result displays a .mp3 file in the content section. I see this: This page contains a demo of and documentation for the new jPlayer Playlist add-on, ... mp3:"http://www.jplayer.org/audio/mp3/Miaow-01-Tempered-song.mp3", ... I don't want Google to show the player code and mp3 files in the search results. How can I hide the audio files and player code from search engines? What would be the best solution for this?
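
    One approach that fits what the question describes (a sketch, not verified against the asker's pages) is to move the playlist definition out of the page markup into an external script, so the mp3 URLs and player configuration are no longer part of the HTML text that search engines use for snippets; the file name and variable below are assumptions:

        // playlist.js -- loaded with <script src="playlist.js"></script> instead of
        // being written inline in each album page. The track below is the example
        // file quoted in the question.
        var albumPlaylist = [
          {
            title: 'Tempered Song',
            artist: 'Miaow',
            mp3: 'http://www.jplayer.org/audio/mp3/Miaow-01-Tempered-song.mp3'
          }
        ];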

    Read the article

  • How to verify real people?

    - by Gerben Jacobs
    For a music community, I want bands to be able to verify themselves. What is the best way to do this? For example, I could have the record label email me, but some bands are independent. I could also ask them to put a 'code' on their website or Facebook page and then check it manually. I'm not necessarily looking for a watertight solution, so no scanning of real-life documents, and I'm okay with doing the checks manually. In other words, how can you verify real people through their online presence?
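
    A sketch of the 'put a code on your site' check mentioned above, assuming Node 18+ for the built-in fetch; the URL and token are placeholders:

        // verify_band.js -- fetch the band's page and look for the token they were
        // asked to publish. Placeholder URL and token.
        async function isVerified(siteUrl, expectedToken) {
          const html = await fetch(siteUrl).then((r) => r.text());
          return html.includes(expectedToken);
        }

        isVerified('https://example-band.com', 'bandsite-verification=abc123')
          .then((ok) => console.log(ok ? 'Verified' : 'Token not found'));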

    Read the article

  • Challenge Accepted

    - by Chris Gardner
    Originally posted on: http://geekswithblogs.net/freestylecoding/archive/2014/05/20/challenge-accepted.aspx It appears my good buddies in The Krewe have created The Krewe Summer Blogging Challenge. The challenge is to write at least two blog posts a week for 12 weeks over the summer. Consider this challenge accepted. So, what can we expect coming up? I still have the Kinect v2 Alpha kit. Some of you may have seen me use it in talks. I need to make some major API changes in The Krewe WP8 App. Plus, I may have Xamarin on board to help with getting the app to the other platforms. I am determined to learn F#, and I'm taking all of you with me. I am teaching a college course this summer. I want to post some commentary on that side of training. I am sure some biometric stuff will come up. Anything else you guys may want. I have created tasks on my schedule to get a new blog post up no later than every Tuesday and Friday. We'll see how that goes. Wish me luck.

    Read the article

  • If not now, then when?

    - by Chris Gardner
    Originally posted on: http://geekswithblogs.net/freestylecoding/archive/2013/10/25/if-not-now-then-when.aspx The time has been flying by this year. It seems like only yesterday that I mentioned the gorillagator, a simple construct of confusion to try to draw attention to my message. In reality, that message was sent over a month ago. During that time, the hours slipped to days and days to weeks. Many exciting things have happened to me; I'm sure many exciting things have happened to you. I'm also sure that many terrifying things have happened to children and their families. 62 children enter treatment at a Children's Miracle Network Hospital every minute. That's nearly 60,000 children since I sent the last email. To put that number in perspective, that is more than the population of Greenland. If we expand that to the past year, nearly 550,000 children have been treated. That is almost the population of Huntsville, Decatur, and all their suburbs combined. Over the past 4 years, I have raised a little more than $3,000 for Children's Hospital of Alabama. As a result, I received a call from the organizers of Extra Life thanking me for my dedicated work and informing me that I was the top supporter for Children's Hospital of Alabama ... with my measly three grand. We can do much better than that. It may sound like I'm trying to have fun by playing games for 24 hours. It is more than that. It is me using my time and body as a catalyst. It is me putting my passion to work for a cause. It is me turning my love into something tangible. I have been campaigning and fighting to give these children a chance for years. I have been asking you to help me support these children and families. I've been putting in countless hours of talking to people, impassioned emails, and carefully constructed tweets. I have been fighting with cutting edge, and sometimes expensive, technology to try to provide live streams of my marathons. I yearly put my body through 24 (and, this year, 25) hours of no sleep. I do this to represent the countless hours these families sit awake at their children's side. All I ask is a few minutes on a website and a few dollars. These few minutes and few dollars go a long way toward helping people who are experiencing circumstances that only occur in our nightmares. I also ask that you take one extra step. Forward this plea to those that you know. I can only reach a small fraction of a percentage of the people that may be able to help. Together, we can reach the world. I raise money for Children's Hospital of Alabama. As this message branches out, people may wish to support a hospital closer to their area. I have included a link to the list of people that have dedicated their time and have received no donations. Find someone on the list supporting your local hospital and give them a donation. Let them know that their time and effort are appreciated. Together, we can do something great. Together, we can make a difference. Together, we all stand tall. Thank you. You can get more information at http://www.extra-life.org and http://childrensmiraclenetworkhospitals.org/ My donation page is http://www.extra-life.org/participant/cgardner The list of participants without donations is http://www.extra-life.org/index.cfm?fuseaction=donorDrive.eventParticipantList&page=629&eventID=512

    Read the article

  • Hangul calligraphy (TTF)

    - by 2x2p1p
    Hi guys. I want a nice hangul font. Can somebody recommend one? Something elegant and beautiful like this English calligraphy (image omitted). I would like to apply it using CSS 3:

        <!DOCTYPE html>
        <html>
        <head>
          <meta charset="utf-8">
          <style>
            @font-face {
              font-family: "hangul";
              src: url("hangul.ttf");
            }
            body { font-family: hangul; }
          </style>
          <title></title>
        </head>
        <body>
          ? ? ?   <!-- hangul sample text; the original characters were lost in transcription -->
        </body>
        </html>

    Thanks

    Read the article

  • Search ranking for important keywords has gone down drastically [duplicate]

    - by Vaivhav
    This question already has an answer here: How to diagnose a search engine ranking drop? (5 answers) Firstly, we are a small entrepreneurial team of 3 people, and I am more of an amateur webmaster for the company's website, as we cannot really afford a technical person/department right now. A few weeks ago, our website traffic and rankings for most keywords dropped overnight. I have done a lot of reading since and learned about Penguin 2.1, which people said is the reason for the drop. Something like this had never happened before. I have gone through the entire Google Webmaster help section. It says there that if a manual penalty is taken against us, we would see a message on the Manual Actions page. So far, we haven't received any notice from Google about web spam. Some SEO people I contacted said they found spam links in our backlink profile. I do believe I mistakenly purchased a cheap link/SEO scheme when I was still very new to SEO. That was more than a year back, but since then we have been legitimate. Moreover, how do I find out which links are spam and which are not? Our content is all original, fresh, and the best you will find in our niche. We also have a blog, but on a different domain (wordpress.com), from which we send anchored links to our business website. Is this a good thing to do? Now, how should we proceed to recover our traffic/rankings? I tried searching in Webmaster Tools for a way to reach Google and ask them why the traffic decreased suddenly, but I couldn't find a contact form or anything similar. Can someone please go through our website and help clarify the reason for the drop, along with a solution? I will really appreciate this, as I can't figure it out and it's taking a lot of time. Vaivhav

    Read the article

  • Website signaled as containing malware

    - by Bakaburg
    I've got a nasty problem with one of our websites. Google and other services have flagged it as containing malware. We haven't been able to figure out how to deal with the problem. Could anyone point us in the right direction? UPDATE: I used Google Webmaster Tools to review the suspicious website, and now it says it's OK, even though I didn't change anything! How can that be? A false alarm?

    Read the article

  • If a visitor's IP address contains "google" or a similar keyword, does this mean they were a crawler?

    - by Roscoe
    Hi, I have a huge list of IP addresses recorded from various visitors to a website. A large share of the visitors, in some months over 70%, came from addresses whose hostnames contained keywords such as google, yahoo, bot, crawler, etc. Does this mean that those visitors were in fact search engine crawlers? If so, why are there so many crawlers in my visitor records compared to genuine human visitors? (And if not, what's the explanation?) Thanks in advance.
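
    The usual way to confirm that a hit really came from a search engine crawler is a reverse DNS lookup on the IP followed by a forward lookup to confirm the hostname resolves back to the same address. A sketch using Node's built-in dns module; the example IP is a placeholder:

        // verify_crawler.js -- reverse-then-forward DNS check for a crawler IP.
        const { reverse, resolve4 } = require('dns').promises;

        async function looksLikeGooglebot(ip) {
          const hosts = await reverse(ip);   // e.g. ['crawl-66-249-66-1.googlebot.com']
          const host = hosts.find((h) => /\.googlebot\.com$|\.google\.com$/.test(h));
          if (!host) return false;
          const addrs = await resolve4(host);            // forward-confirm the hostname
          return addrs.includes(ip);
        }

        looksLikeGooglebot('66.249.66.1').then(console.log);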

    Read the article

  • Is Azure Compatible with JPEG XR?

    - by Shawn Eary
    I just put an F#/MVC app into a Windows Azure solution as a Web Role. Before migration, my JPEG XR (*.WDP) files were displayed on the client in IE9 without issue via my local and hosted sites. Now, after migration into Windows Azure, my JPEG XR files are displayed neither in my local Windows Azure compute emulator nor when they are deployed to http://*.cloudapp.net. Is there some sort of conflict between Windows Azure and JPEG XR (*.wdp) files? If so, what is the accepted best practice for overcoming it?

    Read the article

  • Should users be deleted after inactivity on a website?

    - by Hovaness Bartamian
    When you run a social website, or any website where users can register, would you eventually delete accounts after a certain period (say, a year of inactivity), or would you rather keep their records forever? I know websites like Facebook have a large number of inactive, duplicate and fake accounts. So I'm wondering whether, after two years of inactivity, it would be all right to send the account holder a warning email saying the account will be deleted unless they log in. I'm just thinking about clean and efficient database management, and about any implications this may have for potential new users.
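
    A minimal sketch of the warn-then-delete policy described above, using an in-memory user list; the two-year threshold comes from the question, the 30-day grace period is an assumption, and a real implementation would query the site's database and send actual emails:

        // cleanup_inactive.js -- decide which accounts to warn and which to delete.
        const DAY = 24 * 60 * 60 * 1000;
        const WARN_AFTER = 2 * 365 * DAY;   // two years of inactivity
        const GRACE_PERIOD = 30 * DAY;      // assumed time between warning and deletion

        function planCleanup(users, now = Date.now()) {
          const warn = [];
          const remove = [];
          for (const u of users) {
            if (now - u.lastLoginAt < WARN_AFTER) continue;     // still considered active
            if (!u.warnedAt) warn.push(u);                      // needs a warning email
            else if (now - u.warnedAt > GRACE_PERIOD) remove.push(u);
          }
          return { warn, remove };
        }

        // Example: one long-inactive user who was warned 40 days ago.
        console.log(planCleanup([
          { id: 1, lastLoginAt: Date.now() - 3 * 365 * DAY, warnedAt: Date.now() - 40 * DAY }
        ]));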

    Read the article

  • Self-imposed lockout from program

    - by Alex
    I'm plagued by a lack of willpower. I recently started looking for solutions and came across a program for Macs called SelfControl, which completely blocks access to a given set of websites for a given period of time (you can delete the program, restart your computer, or do almost anything else and it will still block those sites for the specified time period, and it doesn't require a password to do it). Unfortunately, there is no Windows analogue. The one that comes closest is Cold Turkey. It lets you set a time in the future, specify a list of websites (or programs - e.g. Explorer, Firefox, Chrome), and you are blocked from accessing them for the whole duration. No password can undo it, no system reboot, etc. The problem is that the program is a buggy piece of garbage, and to ensure that you're not locked out of websites forever, it ships an uninstaller that is just an exe file accessible at any time, which completely defeats the purpose of a self-imposed lockout. I want to make a better version of that program, or find a simple way to prevent access to a given set of programs over a given period of time with no way around it. I've only taken a few introductory courses in Java (I'm a math major), but the internet is really having a negative effect on my studies, and the only way I can do work is to eliminate all distractions. What do I need to learn in order to make a program with the following properties?
    Given a set of .exe files and a time in the future, the program prevents access to those .exe files until the current time equals the given time.
    Restarting the computer doesn't interfere with the program.
    The program can't be uninstalled until the current time equals the given time.
    Another instance of the program can't be created to block the first one.
    I don't care how much programming knowledge I need to acquire to make this program, so please give me a specific list of things I need to study to make this happen, or if something like this already exists, please let me know.
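
    For the first requirement only, a rough sketch of the core blocking loop (Node on Windows, polling and killing blocked executables until an unlock time); the executable names and deadline are placeholders, and it deliberately ignores the hard parts the question is really about, namely surviving reboots and resisting uninstallation:

        // lockout.js -- keep killing blocked programs until the unlock time passes.
        const { execSync } = require('child_process');

        const BLOCKED = ['firefox.exe', 'chrome.exe'];        // placeholder targets
        const UNLOCK_AT = new Date('2025-01-01T18:00:00');    // placeholder deadline

        const timer = setInterval(() => {
          if (Date.now() >= UNLOCK_AT.getTime()) {
            clearInterval(timer);                             // lockout period is over
            return;
          }
          for (const exe of BLOCKED) {
            try {
              execSync('taskkill /IM ' + exe + ' /F', { stdio: 'ignore' });
            } catch (_) { /* process not running: taskkill exits non-zero */ }
          }
        }, 5000);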

    Read the article

  • Deploying my first website!

    - by test
    I have built a data-driven website - an ASP.NET website using the Entity Framework. In my solution I have 4 projects: the web application, PresentationLayer, and 3 class libraries - Data Layer, Business, and Common Layer. In one of these libraries, Common Layer, I have my model (MyModel.edmx). I have always tested my application on Cassini, the ASP.NET Development Server, and have never touched IIS in my life. I bought a domain and hosting on GoDaddy. My instinct is to grab my four folders (one for each layer) and simply move them to the root folder, but I know that's wrong: for one thing, the home page would then be mywebsite.org/presentationlayer/default.aspx, and for another, I start getting a bunch of errors where files do not load or are not found. I also know that I need to adjust the web.config, but I have no experience with where to start. I'm not sure if this matters, but I also have a web service included in my presentation layer.

    Read the article

  • Javascript Only Search Method [on hold]

    - by user2118228
    I need to put a search function on a website that is going to be distributed on a CD-ROM with no access to the internet. It has 80 pages and about 500 'items', so I'd prefer not to have to hard-code hundreds of if statements if possible. I've found a few programs you can buy that will index the site and generate results (Zoom Search, JSS Index, The German Guys'), but each one has odd quirks. Besides, I would rather code it myself to get complete control over it and to really understand what it's doing. Basically, searching for a few words should display the product image and description, and clicking on that should take you to the related URL. This is kind of complicated, and I can't find an easy solution that doesn't involve hundreds of if statements. Has anyone created anything like this, or does anyone know a better method? I'm not really sure of a better way to go about this. I've used PHP/MySQL for search results before, but this site cannot run any PHP.
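
    A minimal sketch of the kind of static, client-side index this calls for (no server required, so it works from a CD-ROM); the two index entries, file names, and field names are placeholders, and in practice the array would be generated once from the 500 items:

        // search.js -- search a static index and render image + title links.
        var INDEX = [
          { title: 'Widget One', keywords: 'widget blue small', url: 'products/widget-one.html', img: 'img/widget-one.jpg' },
          { title: 'Widget Two', keywords: 'widget red large', url: 'products/widget-two.html', img: 'img/widget-two.jpg' }
        ];

        function search(query) {
          var terms = query.toLowerCase().split(/\s+/).filter(Boolean);
          return INDEX.filter(function (item) {
            var haystack = (item.title + ' ' + item.keywords).toLowerCase();
            return terms.every(function (t) { return haystack.indexOf(t) !== -1; });
          });
        }

        // Render the matches into a container element on the results page.
        function renderResults(query, container) {
          container.innerHTML = search(query).map(function (item) {
            return '<a href="' + item.url + '"><img src="' + item.img + '" alt="">' + item.title + '</a>';
          }).join('');
        }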

    Read the article

  • How to grep (or find) on cPanel?

    - by San
    How can I search for a specific string (a function name or a variable name) in my files, which are spread across various directories under the cPanel File Manager? I have a library directory, and the functions in it are used in various apps and pages. Now I need to change something in a library file, and I need to know the impact on the files that use that library's functions. How do I search/find/grep through the hosted files?
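
    If shell access isn't available, a small script run on a local copy of the account's files can do the same job as grep -rn; a sketch in Node, with the root directory and search string as placeholders:

        // find_usage.js -- recursively report every file/line containing a string.
        const fs = require('fs');
        const path = require('path');

        function grep(dir, needle) {
          for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
            const full = path.join(dir, entry.name);
            if (entry.isDirectory()) {
              grep(full, needle);
            } else {
              fs.readFileSync(full, 'utf8').split('\n').forEach((line, i) => {
                if (line.includes(needle)) console.log(full + ':' + (i + 1) + ': ' + line.trim());
              });
            }
          }
        }

        grep('./public_html', 'my_library_function');   // placeholder directory and string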

    Read the article

  • Facebook multi-friend-selector + new javascript API = BROKEN ?

    - by Simon_Weaver
    I am using the fb:serverfbml tag to render a multi-friend-selector inside an IFrame. I am using the new JavaScript API. I have been trying ALL DAY to get it working. When I click on the underlined 'selected' link (to filter by the selected friends), the whole page refreshes and the selected friends disappear. Does the multi-friend-selector just not work with the JavaScript API?

        <fb:serverfbml>
          <script type="text/fbml">
            <fb:request-form action="http://apps.facebook.com/rollingrazor/" target="_top" method="POST" invite="true"
                type="Blah blah blah"
                content="Blah blah! &lt;fb:req-choice url=&quot;http://apps.facebook.com/rollingrazor/&quot; label=&quot;Let me check my friends&quot; /&gt;">
              <fb:multi-friend-selector showborder="false" actiontext="Invite your friends" rows="5" cols="5" bypass="cancel" target="_top" />
            </fb:request-form>
          </script>
        </fb:serverfbml>
        <div id="fb-root"></div>
        <script>
          window.fbAsyncInit = function () {
            FB.init({ appId: 'xxxxxxx', status: true, cookie: true, xfbml: true });
          };
          (function () {
            var e = document.createElement('script');
            e.async = true;
            e.src = document.location.protocol + '//connect.facebook.net/en_US/all.js';
            document.getElementById('fb-root').appendChild(e);
          } ());
        </script>

    Can someone give me a working example using the new JavaScript API with a multi-friend-selector?

    Read the article

  • Rails: Multi-Step New User Signup Form (FSM?)

    - by neezer
    I've read the "Create Multi-Step Wizard" in Advanced Rails Recipes. I've also read and re-read the documentation for the updated FSM I'm using called Workflow, and looked here and here. The Advanced Rails Recipe focuses on records (quizzes) that already exist, and doesn't cover creating new ones. The Workflow docs don't cover any code for controllers or views, so I've no idea what to do with all this model magic, and the last two links barely touch on implementation either. From the aforementioned resources, I have a good understanding of what a FSM in Rails is and how to play with it in the console or IRB, but I've got very little direction or understanding how to implement one into my Rails app. What I would like is this: a simple, multi-step user signup process. Step 1: User enters in their critical details (with validations). Step 2: User enters in their search criteria, for their profile (with validations). Step 3: User agrees to the Terms of Service (with validations). Step 4: User is greeted by a confirmation page, including a link that takes them to their newly created account. I'd also like full navigation between the steps and full capture (saves to the database) with each transition. Can someone please give me a clear implementation of something similar to this? I would LOVE an example app that includes a multi-step signup process where I can look at the code (FULL source code--models AND controllers and views) under the hood, but I've been unable to find anything like that. Any guidance would be appreciated! EDIT: Please help make this a Railscast! Ryan B. (a.k.a. Superman), if you're reading this, we need you! http://feedback.railscasts.com/forums/77-episode-suggestions/suggestions/35553-multi-step-forms-and-wizards

    Read the article

  • Avoid MySQL multi-results from SP with Execute

    - by hhyhbpen
    Hi, I have a stored procedure like this:

        BEGIN
          DECLARE ...
          CREATE TEMPORARY TABLE tmptbl_found (...);
          PREPARE find FROM "
            INSERT INTO tmptbl_found
              (SELECT userid FROM
                 ( SELECT userid FROM Soul WHERE .?.?. ORDER BY .?.?. ) AS left_tbl
                 LEFT JOIN Contact ON userid = Contact.userid
               WHERE Contact.userid IS NULL
               LIMIT ?)
          ";
          DECLARE iter CURSOR FOR SELECT userid, ... FROM Soul ...;
          ...
          l:LOOP
            FETCH iter INTO u_id, ...;
            ...
            EXECUTE find USING ..., ..., u_id, ...;
            ...
          END LOOP;
          ...
        END//

    and it gives multi-results. Besides being inconvenient, if I get all these multi-results (which I really don't need at all) - about 5 (the LIMIT parameter) for each of the hundreds of thousands of records in Soul - I'm afraid it will eat all my memory, and all in vain. I also noticed that if I PREPARE from an empty string, it still produces multi-results... How do I at least get rid of them in the EXECUTE statement? And I would like a recipe for avoiding ANY output from the SP, for any possible statement (I also have a lot of "update ..." and "select ... into ..." statements inside, if they can produce multi-results). Thanks for any help...

    Read the article

  • FBA site owner encounters access denied in SharePoint 2007

    - by intangible02
    I created a SharePoint 2007 publishing site, first using Windows authentication, then extended it to another site using FBA. I created an FBA user and set it as the site collection administrator as well as the top-level site owner. I also made the application pool that the FBA site runs in use a user account that is in the administrators group. But I get access denied errors when browsing certain links with this site owner account. Are there other settings I need to configure? I found that in the web.config, impersonation is set to true. How does this affect access rights?

    Read the article

  • How to Add a File from my source tree to Maven Site

    - by Charles O.
    I have a Maven 2 RESTful application using Jersey/JAXB. I generate the JAXB beans from a schema file, which lives in my resources directory, e.g., src/main/resources/foo.xsd. I want to include the foo.xsd file in the generated Maven site for my project, so that clients can see the XML schema when writing RESTful calls. How can I include foo.xsd in the site? I could keep a copy of the file in src/main/site/... and then update my site.xml to point to it (or have a .apt whose contents point to it), but I don't like that because I'm still tweaking foo.xsd and don't want to have to remember to copy it each time I update it. And that's just bad practice. I also tried having a .apt file that links to the foo.xsd that gets copied to the target/classes directory. That works until I do a site:deploy, because that only copies the target/site directory. Thanks, Charles

    Read the article

< Previous Page | 36 37 38 39 40 41 42 43 44 45 46 47  | Next Page >