Search Results

Search found 9728 results on 390 pages for 'meysam pro'.

  • Anyone know a way to find out number of Twitter followers for a particular account on any given date?

    - by mejpark
    Hello. I manage two corporate Twitter accounts and two personal Twitter accounts, and it would be really useful to know how many followers each account has at the end of each week. I'm using TweetStats.com to gather statistics at the moment, but the follower stats functionality isn't granular enough to determine the precise number of followers. Does anyone know of any useful and free tools that can provide the exact number of followers for a specific Twitter account on any given date? Thank you, Mike.
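
    The Twitter API doesn't expose historical follower counts, so one common workaround is to record the current count yourself once a day (for example from cron) and build your own history going forward. A minimal PHP sketch, assuming Twitter's REST API v1.1 users/show endpoint with an app-only bearer token; the account name, token and file name are placeholders:

        <?php
        // Log today's follower count to a CSV; run once daily from cron.
        $account     = 'your_screen_name';   // hypothetical account
        $bearerToken = 'YOUR_BEARER_TOKEN';  // app-only OAuth 2.0 token

        $ch = curl_init('https://api.twitter.com/1.1/users/show.json?screen_name='
            . urlencode($account));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HTTPHEADER,
            array('Authorization: Bearer ' . $bearerToken));
        $json = curl_exec($ch);
        curl_close($ch);

        $data = json_decode($json, true);
        if (isset($data['followers_count'])) {
            // Append "YYYY-MM-DD,count" so each day becomes one data point.
            file_put_contents('followers.csv',
                date('Y-m-d') . ',' . $data['followers_count'] . "\n",
                FILE_APPEND);
        }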

  • Add second site through iframe

    - by Anna Danson
    I have two blogs on Tumblr. Let's call them pets.tumblr.com and cats.tumblr.com. A while ago I decided to make pets.tumblr.com my main blog, but since cats.tumblr.com grew more popular, I decided to merge the two sites. What I have done is this: I've created a blank page at pets.tumblr.com/cats, put a full-sized iframe on it with cats.tumblr.com as its source, and added a jQuery redirect script to cats.tumblr.com that redirects to pets.tumblr.com/cats. I'm wondering whether this will impact my site negatively. Will search engines see pets.tumblr.com/cats as a blank site (are iframes ignored?) and cats.tumblr.com as a spam site because it redirects to a blank one?

  • Apache cannot find mysql database modules

    - by user809857
    I've created a simple Django project and set up a MySQL database for it. The project just creates an entry in the database, and it works fine when I use Django's built-in development server (runserver). But when I deployed the project on Apache with mod_wsgi (Ubuntu server), Django could not find 'books', which in this case is my table in the database. The MySQL database I use under runserver and under Apache is exactly the same. I also rebuilt the database using Django's sqlall, validate and syncdb commands, but I still get the error. What could be wrong with what I'm doing? Thanks
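
    A "table doesn't exist" error that appears only under Apache usually means mod_wsgi is loading a different settings module, and therefore a different database, than runserver does. A minimal sketch of the Apache side, with hypothetical paths; the key is that the WSGI script and python-path pin down exactly one project and settings file:

        # /etc/apache2/sites-available/myproject  (paths are illustrative)
        <VirtualHost *:80>
            ServerName example.com

            # Run the app in its own daemon process, with the project (and its
            # virtualenv, if any) on the Python path so the intended settings.py
            # is the one that gets imported.
            WSGIDaemonProcess myproject python-path=/srv/myproject:/srv/venv/lib/python2.7/site-packages
            WSGIProcessGroup  myproject
            WSGIScriptAlias   / /srv/myproject/apache/django.wsgi
        </VirtualHost>

    If the paths check out, compare the database name the WSGI script's settings actually point at with the one runserver uses; a syncdb run against one database does nothing for the other.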

  • Should CloudFlare's MX record point to a CNAME or an A record?

    - by user7787
    CloudFlare's official support page covers mail setup here: https://support.cloudflare.com/hc/en-us/articles/200168876-My-email-or-mail-stopped-working-What-should-I-do- but traditionally an MX record should not point to a CNAME: http://www.exchangepedia.com/blog/2006/12/should-mx-record-point-to-cname-records.html CloudFlare also has a feature called "CNAME Flattening". Is that related, and does it make pointing an MX record at a CNAME acceptable? In short, should I set CloudFlare's MX record to a CNAME?
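
    For what it's worth, the traditional rule (RFC 2181) is that an MX target should be a plain hostname with an A record, not a CNAME; CNAME flattening concerns how CloudFlare answers queries for flattened names and doesn't change that advice. A zone-file style sketch, with hypothetical names and an example IP:

        ; Give the mail host its own A record, and point the MX at that name.
        mail.example.com.    IN  A    203.0.113.10
        example.com.         IN  MX   10 mail.example.com.

    In CloudFlare's DNS panel that typically means an unproxied (grey-cloud) A record for the mail host, plus an MX record pointing at it.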

  • DNS - domain conflict?

    - by Stefanos.Ioannou
    I was given two domains, domain.com and domain.info (both on GoDaddy), and two servers on Digital Ocean: 107.107.38.99 (Rails app) and 107.107.90.17 (WordPress platform). At first I was instructed to associate domain.com with 107.107.38.99 (the Rails app). Then I was instructed to de-associate that IP from domain.com and to associate 107.107.90.17 with domain.com instead. Then I was instructed to associate domain.info with 107.107.38.99 (the Rails app). Right now, when I go to domain.com, the WordPress platform (107.107.90.17) loads fine, which is expected. But when I go to domain.info for the Rails app (107.107.38.99), I get redirected to domain.com. This is not expected, and it is really weird to me. When I ping domain.info I get this:

        PING domain.info (107.107.38.99): 56 data bytes
        64 bytes from 107.107.38.99: icmp_seq=0 ttl=50 time=74.601 ms

    which shows the correct IP, so I don't understand why I get redirected to domain.com (which, when I ping it, gives):

        64 bytes from 107.107.90.17: icmp_seq=0 ttl=50 time=75.057 ms

    The PTR records on Digital Ocean are as follows:

        IP Address       PTR Record
        107.107.38.99    domain.info.
        107.107.90.17    domain.com.

    and the DNS configurations on Digital Ocean are:

        domain.com    A: @ 107.107.90.17    CNAME: * @
        domain.info   A: @ 107.107.38.99    CNAME: * @

    I am not sure what the issue is; if you have any clue, please let me know and I will be really grateful. If you need any other info, let me know.
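
    One thing worth ruling out: DNS cannot issue redirects, so if ping already resolves domain.info to the right IP, the redirect is almost certainly coming from the web server or Rails app listening on 107.107.38.99 (a canonical-host rewrite, for instance). A quick check from a terminal; the output lines here are illustrative:

        $ dig +short domain.info A
        107.107.38.99                     # DNS is fine if this matches

        $ curl -I http://domain.info/
        HTTP/1.1 301 Moved Permanently    # a redirect status plus a
        Location: http://domain.com/      # Location header here means the
                                          # server, not DNS, is redirecting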

  • Connecting Google Analytics with Custom Search Engine AdSense

    - by Yochai Timmer
    I have a Custom Search Engine that I've created with AdSense. I've put that search engine as a site search in my Google Sites page. I've connected both the Custom Search Engine and the Google Site to my Analytics page via their settings pages. Now, I'm trying to get Analytics to show me the AdSense for Search statistics. I've managed to connect the Google Sites page, to the Analytics, and I can see the search statistics in the Analytics as well. But I can't get it to show the actual AdSense for Search statistics from the Custom Search Engine. How can I configure everything so I can get the AdSense for Search statistics of my Custom Search Engine in my Analytics page?

  • Ajax site not being crawled - have escaped fragment, what's wrong? [closed]

    - by Harry
    My site is anonkun.com. You can see that it's "ajax" and doesn't load much HTML. Here are some example pages:

        http://anonkun.com
        http://anonkun.com/?_escaped_fragment_=
        http://anonkun.com/stories/Dev-kun---FAQ/6ef881f8-cf48-4f87-a688-c585f23809c5
        http://anonkun.com/stories/Dev-kun---FAQ/6ef881f8-cf48-4f87-a688-c585f23809c5?_escaped_fragment_=

    As you can see, the original pages have the fragment meta tag and the escaped-fragment versions load static HTML. So why am I not getting crawled? (Screenshot: http://cl.ly/image/2n30212q0K2W) Webmaster Tools shows that pages are being seen as duplicates, and Fetch as Google shows me the ajax version of the source, not the static escaped-fragment version. What's wrong, and how do I make this work? Thanks.
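
    For reference, the opt-in for pages whose visible URLs contain no #! is the fragment meta tag; Google is then supposed to fetch the ?_escaped_fragment_= variant and index its HTML under the clean URL. A sketch of what each ajax page should carry:

        <head>
          <!-- Tells Googlebot to request this page's ?_escaped_fragment_= snapshot -->
          <meta name="fragment" content="!">
        </head>

    If Fetch as Google still returns the ajax shell, it is worth confirming the server serves the static snapshot to the _escaped_fragment_ request itself rather than only to browsers; "duplicate" often means the snapshots of different URLs come back effectively identical.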

  • are keywords in URLs good SEO or needlessly redundant?

    - by Blazemonger
    A coworker and I are locked in a debate over the value of SEO keywords in the URL of a page. She wants to change all the filenames of the HTML pages of a fencing company's site so they look like residential-home-chicago.html, contact-chicago-contractor.html, and so on. She is convinced that because Google highlights keywords in URLs in search results, putting keywords there is more valuable. My position is that these do not improve SEO, since Google doesn't seem to give keywords in the URL any more weight than keywords in the body of the page, and might even give them less weight. In the meantime, they make it harder for me to find the pages I want when it's time to edit them, and the site as a whole looks cheap and spammy. Google's own SEO guide suggests to me that yes, keywords in URLs are useful, but not superior, and that they are more useful for human readability than for search engine rankings. I'm looking for authoritative sources that support either position, not blog articles from SEO optimization companies trying to promote themselves.

  • How to filter a mysql database with user input on a website and then spit the filtered table back to the website? [migrated]

    - by Luke
    I've been researching this on Google for literally 3 weeks, racking my brain and still not quite finding anything. I can't believe this is so elusive. (I'm a complete beginner, so if my terminology sounds stupid, that's why.) I have a database in MySQL/phpMyAdmin on my web host. I'm trying to create a front end that will allow a user to specify criteria for querying the database without having to know SQL: basically just combo boxes and checkboxes on a form. The form would then submit a query to the database and show the filtered tables. This is how the SQL looks in Microsoft Access:

        // pops up a list of parameters for the user to input
        PARAMETERS TEXTINPUT1 Text ( 255 ), NUMBERINPUT1 IEEEDouble;
        // selects only the unique rows in these three columns
        SELECT DISTINCT Table1.Column1, Table1.Column2, Table1.Column3
        // the table where this query is happening
        FROM Table1
        // the filter criteria: compare the user's input parameters to the
        // data in the rows and keep only the matching rows
        WHERE (((Table1.Column1) Like [TEXTINPUT1])
          AND ((Table1.Column2)<=[NUMBERINPUT1])
          AND ((Table1.Column3)>=[NUMBERINPUT1]));

    What I don't get: WHAT IN THE WORLD AM I SUPPOSED TO USE (that isn't totally hard)!? I've tried Google Fusion Tables: they don't filter numerical data or empty cells in rows correctly, and can't relate tables. I've tried DataTables.net: it can't filter numerical data correctly and can't use SQL without a bunch of in-depth knowledge (I'm not even sure it can with that). I've looked into using jQuery with Google Spreadsheets: that doesn't work at all either. I have no idea how I'm supposed to build a front end for my database. Every place that looks promising (like Zoho Creator) asks for money and is far too simplified to handle the LIKE criteria or the SELECT DISTINCT stuff.
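
    What that Access query does can be reproduced with a plain HTML form posting to a PHP script that runs a parameterized query through PDO. A minimal sketch; the form-field, database, table and column names are hypothetical stand-ins for TEXTINPUT1, NUMBERINPUT1 and Table1:

        <?php
        // Connect, then filter Table1 using the visitor's form input.
        $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        $text = isset($_POST['text1']) ? $_POST['text1'] : '%';
        $num  = isset($_POST['num1'])  ? (float)$_POST['num1'] : 0;

        // Placeholders keep the user's input out of the SQL string itself.
        $sql = 'SELECT DISTINCT Column1, Column2, Column3
                FROM Table1
                WHERE Column1 LIKE :text AND Column2 <= :hi AND Column3 >= :lo';
        $stmt = $pdo->prepare($sql);
        $stmt->execute(array(':text' => $text, ':hi' => $num, ':lo' => $num));

        // Spit the filtered rows back to the page as an HTML table.
        echo '<table>';
        foreach ($stmt as $row) {
            echo '<tr><td>' . htmlspecialchars($row['Column1']) . '</td><td>'
                . htmlspecialchars($row['Column2']) . '</td><td>'
                . htmlspecialchars($row['Column3']) . '</td></tr>';
        }
        echo '</table>';

    The important part is that user input only ever reaches the SQL through bound placeholders, never by string concatenation.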

  • htaccess redirect problem

    - by jimbo
    Hi all, I am currently building a site and want to hide all development work on it. I am using an .htaccess file to redirect everything to my holding page, index.php?id=7:

        Options +FollowSymlinks
        RewriteCond %{REQUEST_URI} !/index.php
        RewriteCond %{REQUEST_URI} !/assets/
        RewriteRule $ /index.php?id=7 [R=307,L]

    This works for pretty much all pages, but changing index.php?id=7 to another number (id=6, for example) still shows that page with no redirect. Any help welcome...
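
    One likely explanation: %{REQUEST_URI} never includes the query string, so the !/index.php condition exempts index.php?id=6 just as much as index.php?id=7. Testing the query string separately closes that hole; a hedged sketch of the same file:

        Options +FollowSymlinks
        RewriteEngine On
        RewriteCond %{REQUEST_URI} !/assets/
        # Redirect everything unless the request is literally index.php?id=7:
        RewriteCond %{REQUEST_URI}  !^/index\.php [OR]
        RewriteCond %{QUERY_STRING} !^id=7$
        RewriteRule $ /index.php?id=7 [R=307,L]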

  • Need help with some mod_rewrite on lighttpd

    - by Christoph
    Hello, I can't get my mod_rewrite configuration working where I'm using WordPress and MyBB, and now I need your help because I can't deal with it... Here is the code: The problem is with the third, fifth and sixth lines. With the third, it can't display comments (error 404). With the fifth, forum categories are not working. Finally, with the sixth, posts aren't working. I appreciate any help. Thanks! EDIT: You're right. What I want is, for example: from the link example.com/forum/forumdisplay.php?fid=71 to example.com/forum/post// <= only the id is important here; the post name is only for SEO. The same goes for 'dzial'. And for WordPress, from: example.com/portal/comment-page-#comment-1995 <= the name, id and page number are important here. Any ideas how to deal with this?
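
    Without the original rules it's hard to say which line is broken, but for the forum half of what you describe, lighttpd's url.rewrite-once can map the pretty URLs onto MyBB's real scripts while simply ignoring the SEO text after the id. A sketch under that assumption; the URL patterns are illustrative:

        url.rewrite-once = (
            # /forum/post/123/anything  ->  the thread with tid=123
            "^/forum/post/([0-9]+)(/.*)?$"  => "/forum/showthread.php?tid=$1",
            # /forum/dzial/45/anything  ->  the forum with fid=45
            "^/forum/dzial/([0-9]+)(/.*)?$" => "/forum/forumdisplay.php?fid=$1"
        )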

  • Error using SoapClient() in PHP [migrated]

    - by Dhaval
    I'm trying to access a WSDL (Web Services Description Language) file using PHP's SoapClient(). I found that the WSDL file is authenticated, so I tried passing credentials in an array as the second parameter, and I have active SSL on my server, but I'm still getting an error. Here is the code I'm using:

        $client = new SoapClient(
            "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl",
            array("trace" => "1", "Username" => "username", "Password" => "password"));

    Here is the error I'm getting:

        Warning: SoapClient::SoapClient(https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl)
        [soapclient.soapclient]: failed to open stream: Connection timed out in PATH_TO_FILE on line 80

        Warning: SoapClient::SoapClient() [soapclient.soapclient]: I/O warning : failed to load external
        entity "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE on line 80

        Fatal error: Uncaught SoapFault exception: [WSDL] SOAP-ERROR: Parsing WSDL: Couldn't load from
        'https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl' : failed to load external
        entity "https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl" in PATH_TO_FILE:80
        Stack trace: #0 /home2/wingstec/public_html/widget/API/index.php(80):
        SoapClient->SoapClient('https://webserv...', Array) #1 {main} thrown in PATH_TO_FILE on line 80

    The error seems to say the file doesn't exist at the given path, but when we open that path directly in a browser we do get the file. Can anyone help me figure out what exactly the problem is?
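
    Two things stand out here. First, "Connection timed out" points at the network rather than at authentication: the web host may be blocking outbound connections to port 8081, which is easy to test with curl or telnet from the server itself. Second, SoapClient has no "Username"/"Password" options; HTTP authentication for fetching a protected WSDL uses the login and password keys. A sketch:

        <?php
        // 'login'/'password' supply HTTP auth for the WSDL fetch;
        // 'trace' => 1 records requests/responses for debugging.
        $client = new SoapClient(
            'https://webservices.chargepointportal.net:8081/coulomb_api_1.1.wsdl',
            array(
                'trace'    => 1,
                'login'    => 'username',
                'password' => 'password',
            )
        );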

  • Gravity Forms not loading under https, jQuery is not defined

    - by cmykrgbb
    I am using Gravity Forms on my WordPress site, and so far so good. The problem is that I have made the page secure (HTTPS/SSL), and this stops the form from working. The issue looks like it is how the site is trying to load jQuery: there are 23 JS errors on the page, which all seem to stem from a failed jQuery load ("Uncaught ReferenceError: jQuery is not defined"). If I go to the URL the source is trying to pull the jQuery file from, I see the error: https://code.jquery.com/jquery-1.7.1.min.js?ver=3.4.2 Screenshot of the error: https://www.evernote.com/shard/s212/sh/326f95d6-a498-4c33-b413-7e968225cc79/c2e380ed0fa02a913f712005c8301185 And this screenshot is the reference in the page source: https://www.evernote.com/shard/s212/sh/ae547962-c017-4321-90a2-c51433e59262/124ae116f2b803771f4eb36c90b5a524 I have been told that's where the ultimate issue is, but I don't really know what to do next. Is it failing because of Gravity Forms, the WordPress HTTPS plugin, or my SSL certificate...? Thanks in advance!
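
    Browsers refuse to run a script on an https page when the script itself fails to load securely, and everything that depends on jQuery (Gravity Forms included) then dies with "jQuery is not defined". Since the failing copy comes from an external CDN, one hedged fix is to serve the copy WordPress ships instead, from the theme's functions.php; the function name here is arbitrary:

        <?php
        // Swap the CDN jQuery for WordPress's bundled copy, which is
        // served from your own domain and therefore over your own SSL.
        function my_use_bundled_jquery() {
            if (!is_admin()) {
                wp_deregister_script('jquery');
                wp_register_script('jquery',
                    includes_url('js/jquery/jquery.js'), array(), null);
                wp_enqueue_script('jquery');
            }
        }
        add_action('wp_enqueue_scripts', 'my_use_bundled_jquery');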

  • Is it true that quickly closing a webpage opened from a search engine result can hurt the site's ranking?

    - by Austin ''Danger'' Powers
    My web designer recently told me that I need to be careful not to Google my business' website, click on its search result link, and then quickly close the page (or click back) too many times. He says "Google knows" that I didn't stay on the page, and could penalize my site for having a high click-through rate if it happens too much (it was something along those lines; I forget the exact wording). Apparently, it could look like the behavior of a visitor who was not interested in what they found, hence the alleged detrimental effect on the site's search ranking. I find this hard to believe, because I would not have thought any information is transmitted that tells Google (or anyone else, for that matter) whether a website is still open in a browser (in my case Firefox v25.0). Could there possibly be any truth to this? If not, why might he have come to this conclusion? Is there some click-tracking or similar technology employed by search engines that does something like this? Looking forward to hearing everyone's thoughts.

  • IIS Not Accepting Active Directory Login Credentials

    - by Dale Jay
    I have an ASP.NET web form using Microsoft's boilerplate Active Directory login page, set up exactly as suggested. Windows Authentication is activated at the "Default Website" and "MyWebsite" levels, and Domain\This.User is given "Allow" access to the site. After entering valid credentials for This.User on the web form, a popup window appears asking me to enter my credentials yet again. Despite entering valid credentials (in both the Domain\This.User and This.User formats), it rejects them and returns an unauthorized security headers page (error 401.2). Active Directory user This.User is valid, the IP address of the AD server has been verified, and SPNs have been set up for the server. Error Code: 0x80070005

    Default Web Site security config:

        <system.web>
          <identity impersonate="true" />
          <authentication mode="Windows" />
          <customErrors mode="Off" />
          <compilation debug="true" />
        </system.web>

    Sub-site security config:

        <authentication mode="Windows">
          <forms loginUrl="~/logon.aspx" timeout="2880" />
        </authentication>
        <authorization>
          <deny users="?" />
          <allow users="*" />
        </authorization>
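
    Error 0x80070005 combined with repeated credential prompts is classically a failed Kerberos negotiation (a stale or duplicate SPN, for example) rather than genuinely bad credentials. One hedged test is to have IIS offer NTLM before Negotiate for the site (IIS 7+ schema; the section may need unlocking at the applicationHost level). If the prompts stop, the SPN setup is the culprit:

        <system.webServer>
          <security>
            <authentication>
              <windowsAuthentication enabled="true">
                <providers>
                  <clear />
                  <add value="NTLM" />
                  <add value="Negotiate" />
                </providers>
              </windowsAuthentication>
            </authentication>
          </security>
        </system.webServer>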

  • advertising servers / advert delivery solutions for C#/Asp.Net

    - by Karl Cassar
    We have a website in which we want to show adverts; however, these are custom adverts uploaded by the webmaster, not Google adverts or adverts that an ad network chooses. Ideally, both options would be available. We were considering developing our own advert-management system, but looking at the big picture, it might be better to consider other alternatives. The website is currently developed in C#/ASP.NET (Web Forms). Are there any recommendations for open-source delivery systems and/or externally hosted advert delivery networks? Personally I've used Google's DFP, but it is sometimes not so easy to get a Google AdSense account approved, especially while a new website is still in development and not yet launched. Not sure if this is the best place to ask this kind of question!

  • Sudden increase in spam report from Yahoo

    - by lulalala
    Recently we experienced a sudden increase in spam reports, all of them coming from Yahoo email addresses. We see lots of registration confirmation emails getting marked as spam. We have also seen people mark a mail as spam and then open it and click on the confirmation link. We send around 150 registration emails a day and currently see 2 spam reports from these per day; previously, spam reports came about once a month. We use SendGrid to send our emails.

  • Benchmark for website speed optimization

    - by gowri
    I am working on website speed optimization, and I mostly use 3 tools for analyzing speed:

        Google PageSpeed tool
        YSlow Firefox extension
        Web Page Performance Test

    I measure performance using the tools above and benchmark the results before and after, like this:

    Before optimization:
        Google PageSpeed Insights score: 53/100
        Web Page Performance Test: 55/100 (first view: 10.710 s, repeat view: 6.387 s)
        Yahoo overall performance score: 68

    After optimization (stage 1):
        Google PageSpeed Insights score: 88/100
        Web Page Performance Test: 88/100 (first view: 6.733 s, repeat view: 1.908 s)
        Yahoo overall performance score: 80

    My questions: Am I doing this the correct way? What is the best way to benchmark speed optimization? Is there a standard? Is there a much better tool for analyzing speed?

  • Server Firewall preventing sending of email [migrated]

    - by Jo Fitzgerald
    The firewall on my VPS appears to be preventing my site from sending email. It was working fine until the end of last month. My hosting provider (Webfusion) has been next to useless. I am able to send email if I open INPUT ports 32768-65535, but not if these ports are closed. Why would this be? I have the following rules in my firewall:

        # sudo iptables -L
        Chain INPUT (policy DROP)
        target     prot opt source                destination
        VZ_INPUT   all  --  anywhere              anywhere

        Chain FORWARD (policy DROP)
        target     prot opt source                destination
        VZ_FORWARD all  --  anywhere              anywhere

        Chain OUTPUT (policy DROP)
        target     prot opt source                destination
        VZ_OUTPUT  all  --  anywhere              anywhere

        Chain VZ_FORWARD (1 references)
        target     prot opt source                destination

        Chain VZ_INPUT (1 references)
        target     prot opt source                destination
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpt:www
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpt:https
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpt:smtp
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpt:ssmtp
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpt:pop3
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpt:domain
        ACCEPT     udp  --  anywhere              anywhere              udp dpt:domain
        ACCEPT     tcp  --  anywhere              anywhere              tcp dpts:32768:65535
        ACCEPT     udp  --  anywhere              anywhere              udp dpts:32768:65535
        ACCEPT     tcp  --  localhost.localdomain localhost.localdomain
        ACCEPT     udp  --  localhost.localdomain localhost.localdomain

        Chain VZ_OUTPUT (1 references)
        target     prot opt source                destination
        ACCEPT     tcp  --  anywhere              anywhere
        ACCEPT     udp  --  anywhere              anywhere

    The VPS is running Plesk 10.4.4 (please ask if you require further technical information to help me).
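
    Opening 32768-65535 "fixes" it because those are the ephemeral ports on which replies to your server's outgoing SMTP connections arrive, and the INPUT chain's DROP policy was discarding them. A narrower fix than leaving the whole range open is to accept only packets belonging to connections the server itself initiated; a sketch:

        # Insert at the top of the input chain: allow return traffic for
        # established outbound connections (and related ICMP), nothing else.
        sudo iptables -I VZ_INPUT 1 -m state --state ESTABLISHED,RELATED -j ACCEPT

    With that rule in place, the two dpts:32768:65535 rules can usually be dropped.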

  • Use alpha or opacity on a table row using CSS [migrated]

    - by mserin
    I have a CSS stylesheet for a webpage. The webpage has a table with a white background color (set on the rows, not on the table). I would like to set the opacity or alpha of that background to 50%. I have tried many variations but have had no luck. A typical row in the HTML file is:

        <tr>
          <td>&nbsp;</td>
          <td>Twitter</td>
        </tr>

    The CSS for table rows (which works perfectly) is:

        tr {
            font-family: Arial, Helvetica, sans-serif;
            background: rgb(255,255,255);
        }

    To get the alpha, I tried:

        tr {
            font-family: Arial, Helvetica, sans-serif;
            background-color: rgba(255,255,255,0.5);
        }

    I have also tried background-color-opacity: 0.5; Any other suggestions?
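
    background-color-opacity isn't a real CSS property, and the rgba() rule above is the right idea, so the usual suspect is a more specific rule painting the cells themselves over the row. Moving the rgba() background onto the cells is a quick hedged test:

        /* If the <td>s carry their own opaque background, the row's rgba()
           never shows through; set it on the cells instead. */
        tr td {
            background-color: rgba(255, 255, 255, 0.5);
        }

    Note that rgba() needs IE9 or later; for older IE, keep a solid background-color line before it as a fallback.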

  • Advice On Price Comparison Affiliate Programs

    - by pixelcook
    I want a price comparison feature on my site similar to Consumer Reports' "Price & Shop" section. They use PriceGrabber.com, but as far as I can tell they have a special deal with CR, so I can't get a similar service for my site. I've gathered that I need to use an affiliate network, but the whole thing seems so shady, I don't really know what sites are legit, and I don't know what sites offer the price comparison feature. Datafeedfile.com comes up a lot during my searches, but the ugly site makes me wary. Does anyone have any experience with this? What affiliate networks do you recommend? Or should I be looking at something else altogether?

  • good/bad idea to use email address in php session variable? [closed]

    - by Stephan Hovnanian
    I'm developing some additional functionality for a client's website that uses the email address as a key lookup variable between various databases (email marketing system, internal prospect database, and a third shared DB that helps bridge the gap between the two). I'm concerned that storing a visitor's email address as a $_SESSION variable could lead to security issues (not so much for our site, but for the visitor). Anybody have suggestions or experience on whether this is okay to do, or if there's another alternative out there?
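
    Session data lives on the server and only the session ID travels in the cookie, so storing the address in $_SESSION mainly raises the stakes of session hijacking rather than exposing the address directly. A hedged hardening sketch; $visitorEmail is a placeholder for however you obtain the address:

        <?php
        // Lock the cookie down before the session starts: send it only
        // over HTTPS (secure) and hide it from JavaScript (httponly).
        session_set_cookie_params(0, '/', '', true, true);
        session_start();

        // Issue a fresh ID once the visitor is identified, so any
        // previously leaked ID becomes useless.
        session_regenerate_id(true);

        $_SESSION['email'] = $visitorEmail;

    The secure flag assumes the pages using the session are served over HTTPS.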

  • Numbered list with subclauses

    - by Barry Clearwater
    I'm trying to create a legal document with decimal-numbered subclauses, then alpha and roman sub-sub and sub-sub-sub clauses. (whew!)

        1. MAIN HEADING
        1.1 This is an example of a sub-clause, and you can see that even though
            the words continue on to the right, it would be best if they wrapped
            around and formed a block to the right of the decimal number
        1.2 In doing so, the normal second clause should also wrap around, but
            the second subsequent clause should hang in from the left and be in
            a block. See below for the remaining clauses
            (a) this list is completely for demonstration and should not be
                construed as legal language in any way, nor should it make sense
                in that
            (b) should the indentation take more than:
                i)   this many lines it would be overly big
                ii)  legal numbering continues in the sub-sub clauses with the
                     use of lower roman lettering and should flow below in a block
                iii) and continue the formatting on to the next line but be
                     underneath the body of the text and not begin directly below
                     the number itself. In this example the text carries out to
                     the right, but I need it to wrap around underneath. Sorry
                     it's so wordy; I need this to show the example.
        2. Second Clause Heading
        2.1 and so it continues thus

    I've found examples for decimal numbering, but they do not create a block out to the right of the number, and they carry on with multiple decimals rather than alpha and roman subclauses.
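
    Assuming this is being set in LaTeX, the enumitem package produces exactly this hanging-block style; the label formats below mirror the example (1., 1.1, (a), i)) and the margins are illustrative:

        \usepackage{enumitem}

        \begin{enumerate}[label=\arabic*., leftmargin=*]
          \item MAIN HEADING
          \begin{enumerate}[label=\arabic{enumi}.\arabic*, leftmargin=*]
            \item A sub-clause; long text wraps as a block to the right of the number.
            \begin{enumerate}[label=(\alph*), leftmargin=*]
              \item An alpha sub-sub-clause.
              \begin{enumerate}[label=\roman*), leftmargin=*]
                \item A roman sub-sub-sub-clause, also set as a hanging block.
              \end{enumerate}
            \end{enumerate}
          \end{enumerate}
        \end{enumerate}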

  • Google Webmaster Tools, DNS Errors & HostPapa

    - by Gravy
    Received a message from Google Webmaster Tools ("Over the last 24 hours, Googlebot encountered 2 errors while attempting to retrieve DNS information for your site. The overall error rate for DNS queries for your site is 40.0%. You can see more details about these errors in Webmaster Tools."), along with a recommended action. I contacted HostPapa and they deny that there is any issue with the site/server!!! Support in terms of what I can do to actually resolve this issue is non-existent!!!! The site is currently online, and I don't know much about DNS... so any advice about what I can do to resolve this problem would be much appreciated. Basically, the message from Google says it is my webhost's fault, while the message from my webhost (HostPapa) is: "Just tell Google to crawl your site, as there are no errors."
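
    Since Google and HostPapa disagree, it can help to query the authoritative nameservers directly; intermittent timeouts there would reproduce what Googlebot sees. A sketch (substitute the real domain, and the NS hostnames the first command returns):

        $ dig +short NS example.com                # which nameservers are published
        $ dig @ns1.hostpapa.example example.com A  # ask each one directly; timeouts
                                                   # here match Googlebot's 40% error rate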

  • URL blocked in robots.txt but still showing up on Google search [closed]

    - by Ahmad Alfy
    Possible duplicate: Why do Google search results include pages disallowed in robots.txt? In my robots.txt I disallow a lot of URLs, and Google Webmaster Tools says there are 750+ URLs blocked. The problem is that the URLs are still showing up in Google search. For example, I have the following rule:

        Disallow: /entity/child-health/

    But when I search for some-keyword + child health, the following URL shows up: http://www.sitename.com/entity/child-health/ Am I doing anything wrong? Is it possible for a URL to be blocked using robots.txt and still show up in search results?
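
    This is expected behavior: robots.txt only forbids crawling, and Google can still index a blocked URL from external links (typically shown without a snippet). To keep pages out of the results entirely, the usual recipe is the reverse: let Googlebot fetch the page and serve a noindex. A sketch:

        <!-- After removing "Disallow: /entity/child-health/" from robots.txt,
             add this to the <head> of each page that should stay unindexed: -->
        <meta name="robots" content="noindex">

    An X-Robots-Tag: noindex response header does the same job for non-HTML resources.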
