Search Results

Search found 24378 results on 976 pages for 'pinned site'.

Page 20 of 976

  • Video for an ads-driven website

    - by AntonAL
    I have a website which I will fill with a bunch of useful videos. I've implemented an ads-rotation engine for articles and will do the same for videos. The next milestone is to decide how video will be integrated. There are two ways: host the videos myself (pros: complete freedom; cons: need tens of gigabytes of storage, plus support for multiple formats to be cross-browser and cross-device), or use YouTube (pros: very simple to use, nothing to do). What are the pros and cons of each approach? Some questions for YouTube: Will I be able to control playback of a YouTube-embedded video to show post-rolls? What is the ranking impact on my website when most of its pages refer to YouTube? Will, say, an iPad play video embedded via YouTube's iframe? Does relying entirely on YouTube make long-term sense for a website that should bring in money?
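
    On the post-roll question: the YouTube IFrame Player API fires playback events, so a site can react when an embedded video ends. A minimal sketch of that idea follows; the div id, video id, and showPostRollAd() are placeholder names, not anything from the question:

      // Load the IFrame Player API asynchronously.
      var tag = document.createElement('script');
      tag.src = 'https://www.youtube.com/iframe_api';
      document.getElementsByTagName('head')[0].appendChild(tag);

      var player;
      // The API calls this global function once it has loaded.
      function onYouTubeIframeAPIReady() {
        player = new YT.Player('player', {              // 'player' is the id of an empty div
          videoId: 'VIDEO_ID',                          // placeholder video id
          events: { onStateChange: onPlayerStateChange }
        });
      }

      // When the embedded video finishes, hand the player area to the ad engine.
      function onPlayerStateChange(event) {
        if (event.data === YT.PlayerState.ENDED) {
          showPostRollAd();                             // hypothetical hook into the site's ad rotation
        }
      }

      function showPostRollAd() {
        document.getElementById('player').innerHTML = '<img src="/ads/post-roll.png" alt="ad">';
      }

    Whether wrapping ads around embedded videos is acceptable is governed by YouTube's terms of service, which is worth checking before building post-rolls around this.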

    Read the article

  • Can .htaccess slow down a site?

    - by Cody Sharp
    I'm working with a client on an e-commerce website. I implemented clean URLs using .htaccess. I also used .htaccess to solve canonical issues such as redirecting www to non-www and removing index.php from the URL. The website recently began to slow down dramatically, sometimes not even loading. The site is hosted on GoDaddy, and when the client called GoDaddy they told him it was the .htaccess file slowing down the website. I find this highly unlikely because of my past experiences, but I'm not 100% sure. My thinking is that the client's website is most likely on a shared server in a busy neighborhood, thus slowing down the site. It's not always slow, but rather sporadic throughout the day, loading fast at some points and slowly at others. Can the .htaccess file slow a website to a crawl? If so, are there better ways to solve these problems with different rewrite rules and such? Here is what the actual .htaccess file looks like:

      Options +FollowSymlinks
      RewriteEngine On
      RewriteBase /
      RewriteCond %{HTTP_HOST} ^www.example.net [NC]
      RewriteRule ^(.*)$ http://example.net/$1 [L,R=301]
      RewriteRule ^products/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=product&product_code=$1 [L]
      RewriteRule ^catalog/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=catalog&catalog_code=$1 [L]
      RewriteRule ^pages/([0-9a-zA-Z\_\-]*)\.htm([l]?)$ index.php?p=page&page_id=$1 [L]
      RewriteRule ^index\.htm([l]?)$ index.php?p=home [L]
      RewriteRule ^site_map\.htm([l]?)$ index.php?p=site_map [L]
      RewriteCond %{QUERY_STRING} ^p=home$
      RewriteRule (.*) ? [R=permanent]

    I'm a .htaccess and regex novice, so any pointed-out mistakes would also help. Thank you.

    Read the article

  • Site migration and SEO impact

    - by John Smith
    I'd greatly appreciate a response to the following question relating to site migration and SEO impact. Here's some background on how my domain name and site are currently configured.

    My domain name provider has the following settings:
    - host name @ is an A record and points to IP address x.x.x.x
    - host name www is an A record and points to IP address x.x.x.x
    - sub-domain host name new.example.com is an A record and points to IP address x.x.x.x

    My hosting provider has the following settings:
    - host record @ is an A record and points to IP address x.x.x.x, folder home/public_html/old
    - host record www is a CNAME record and points to example.com
    - sub-domain host record new.example.com points to home/public_html/new

    I want to:
    - point the domain (example.com AND www.example.com) to the content hosted under folder home/public_html/new, which is currently the content directory for new.example.com
    - retire the content hosted under folder home/public_html/old
    - retire the sub-domain host record new.example.com

    I believe the easiest method of doing this is removing the sub-domain host record new.example.com, and changing the following line in the .htaccess file in home/public_html from

      # Change 'subdirectory' to be the directory you will use for your main domain.
      RewriteCond %{REQUEST_URI} !^/old/

    to

      # Change 'subdirectory' to be the directory you will use for your main domain.
      RewriteCond %{REQUEST_URI} !^/new/

    But I don't understand how this will impact my SERP; ideally, I'd like it to remain the same. Research on this topic turned up the following Google page, which was no help, and this related StackExchange question, which suggests that this should not affect my SERP (at least, not permanently). But I wanted to make certain with a more specific example, and hopefully contribute to the community at the same time. I'd appreciate any feedback on this. Is there a better/recommended method to migrate sites this way? Is there an SEO impact?

    Read the article

  • Domain forwarding to an IE "trusted site" opens a blank page

    - by Michael Jasper
    My employer, a University, regularly hosts conferences and other events. While the websites for these events are hosted on our domain, organizers frequently request customized .com URLs. We then forward these domains to the specific site. Recently, we discovered a problem where a page will not load if the following conditions are met (using a current example):

    - a website is created in our CMS for a conference, e.g. http://continue.weber.edu/nulc
    - a domain is created, http://www.nulc2012.com, and forwarded to http://continue.weber.edu/nulc
    - the user enters http://www.nulc2012.com into their address bar using IE7 or IE8
    - the user has *.weber.edu listed as a "trusted site" in IE security settings (the case for nearly all on-campus computers)

    When this happens, the browser correctly transfers to the page http://continue.weber.edu/nulc/index.php, however the page is blank, returning only:

      <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
      <HTML><HEAD>
      <META content="text/html; charset=windows-1252" http-equiv=Content-Type></HEAD>
      <BODY></BODY></HTML>

    Is there any known solution to this problem? Or am I missing something completely? Note: the tested websites do load correctly in Chrome, Firefox, and Safari.

    Read the article

  • Allow access to WordPress site only by links in email newsletters

    - by Shane
    I send out a personal email newsletter, and have been looking into sending it via a service like MailChimp or sendy.co. Many of these email services suggest, or require, that the email newsletter content be available online, in case the recipient's email app doesn't render it properly, or at all. The thing is, I don't want my newsletter contents visible to the whole world. Nor do I want to require existing recipients to create accounts (or be assigned accounts) with passwords. So, the question is: how can my WordPress site content be viewable only by clicking on the link to it in the email newsletter? It shouldn't be findable in a Google search, but once at the site the visitor can view previous newsletter contents. It seems an .htaccess file would do the trick, but I have been unable to figure out the syntax for this. Thanks for your help. I have copied below two other questions and answers that helped me word my question clearly. Similar to this request about allowing access to a certain group while still restricting access to the world: "Is there a way to password protect a directory only in cPanel, but where the user is not prompted for the password when they try to access it via the web?" This person's question is the closest I could find to my situation: "Restrict direct folder access via .htaccess except via specific links".

    Read the article

  • Upcoming GWB Site Maintenance & Downtime This Weekend

    - by Staff of Geeks
    We'll be performing routine maintenance and a code release this weekend, from late Saturday night to early Sunday morning. There will be moments of site downtime, but we'll minimize this as much as possible, of course. We intend for the following fixes & features to go to production:

    - Over 30 Windows Update hotfixes & security updates
    - Bug Fix: Homepage of GWB currently lists posts by create date, but should list them by first-time publish date. Thanks to Chris Gardner for alerting us about this.
    - Bug Fix: Broken thumbnail images in the Hot Topics and Most Popular areas. Thanks to .ToString(theory) for emphasizing this one.
    - Bug Fix: Not able to create/edit posts in the admin tool using IE 10. (Thanks Benny Matthew)
    - Bug Fix: Admin blog post rich text editor not working in IE 10.
    - Bug Fix: New Twitter connections cannot be established because the Twitter API URL has changed.
    - Feature: New "Minimal" template using the fluid Twitter Bootstrap/Cerulean theme.
    - Feature: Integration with Airbrake exception handling.
    - Feature: Change bio pics in the GWB main feed to be hyperlinked.
    - Feature: Change the hyperlink of MVP icons in the GWB Blogger List area to go directly to the Microsoft MVP search results page for that MVP's name.

    Thanks once again for your patience as we strive to improve the site!
    Ben Barreth, GeeksWithBlogs Community Builder/Software Developer

    Read the article

  • What to choose for a multilingual site with support for Markdown and commenting

    - by Kent
    I want to publish articles on a multilingual site. I want to be able to write an article in two languages and have them available at separate URLs: thesite.foo/english-breakfast and thesite.com/engelsk-frukost. If the user's web browser is set to English I'd like to show a small notice at the top of the Swedish version with a link to the English one. The link should have an appropriate rel attribute for a translation (search for hreflang at http://diveintohtml5.org/semantics.html). There should be a way to list all articles belonging to these sets: Swedish only, English only, Swedish versions + English only, English versions + Swedish only. I'd like to publish these as four RSS feeds. And I would like to have two versions of the main site, one in Swedish (showing Swedish versions + English only) and one in English (showing English versions). I shall be able to write the articles using Markdown, as that is the formatting language I find most convenient. There should be a way for users to comment, and some kind of way for me to protect myself against comment spam. I am leaning towards learning Drupal. I suspect I'll have to code this behavior myself as a module. To be frank, I'd rather work with Java. Is Drupal the way to go? Or is there something more suitable for this project?

    Read the article

  • How do I trust an off-site application

    - by Pieter
    I need to implement something similar to a license server. This will have to be installed off-site at the customers' location and needs to communicate with other applications at the customers' site (the applications that use the licenses) and with an application running in our hosting center (for reporting and getting license information). My question is how to set this up in a way where I can trust that: (1) the license server is really our application and not something that just simulates it; and (2) there is no "man in the middle" (i.e. a proxy or something that alters the traffic). The first thing I thought of was to use client certificates, and that would solve at least (2). However, what I'm worried about is that someone just decompiles the license server (it is built in .NET), alters some logic and recompiles it. This would be hard to detect from both connecting applications. This doesn't have to be absolutely secure, since we have a limited number of customers whom we have a trust relationship with. However, I do want to make it more difficult than a simple decompile/recompile of the license server. I primarily want to protect against an employee or a nephew of the boss trying to be smart.

    Read the article

  • Anonymous vs. logged-in users on my site & Google Analytics

    - by Flowpoke
    I'd like to be able to run two different 'tracks' for Google Analytics: one for anonymous users of the site and another for users who are logged in. I say "track" because I'm not sure of the term, but I definitely know I want it all in the same Analytics account; I just want to segregate my logged-in users. In the site template I can very easily add a conditional to display one or the other Analytics code snippet, which I'm hoping is all this comes down to. Although I'm not sure, it seems that the last digit in your Analytics ID (e.g. UA-15XXXX0-X) could be incremented to gain such additional 'tracks'? Any tips? Am I doing it wrong? My current footer snippet:

      <script type="text/javascript">
        var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
        document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
      </script>
      <script type="text/javascript">
        try {
          var pageTracker = _gat._getTracker("UA-XXXXXXX-1");
          pageTracker._trackPageview();
        } catch(err) {}
      </script>
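
    For what it's worth, the trailing digit in UA-XXXXXXX-X identifies a web property within the account, so creating a second property in the same account and choosing its ID conditionally is one way to get two 'tracks'. A sketch using the same classic ga.js API as the footer snippet above; the isLoggedIn flag is a hypothetical value the site template would have to provide:

      <script type="text/javascript">
        try {
          // Pick the web property based on login state; both properties
          // live in the same Analytics account.
          var trackerId = isLoggedIn ? "UA-XXXXXXX-2" : "UA-XXXXXXX-1";
          var pageTracker = _gat._getTracker(trackerId);
          pageTracker._trackPageview();
        } catch(err) {}
      </script>

    An alternative worth considering is a single property with a custom variable (_setCustomVar) recording the logged-in state, which keeps all traffic in one report and lets it be segmented later.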

    Read the article

  • Joomla: Cross site link boxes

    - by Dean Smith
    I am currently evaluating Joomla for use in the rebuild of our corporate website. One of the features our designs have is spaces for what we've been calling widgets. These widgets include common ones for each page, such as contact-us boxes and section navigation, which all seem pretty easy to implement. The other widgets we would use to highlight specific pages within the site relevant to the one you are on, and they are therefore different for each article. We've called these link widgets, and they are a pretty common sight on many websites. I'm honestly at a bit of a loss as to how to do this with Joomla. From a CMS perspective I'd like to allow users to just select which three link widgets they want when editing the article, and to have some way of creating the widgets within the CMS as well. Editing a link widget would allow setting a title and a bit of text, and selecting the page that it links to. Am I going to be able to do this with Joomla, and if I can, roughly how do I go about it?

    Read the article

  • Joomla 1.6 site cannot add a new extension through admin interface

    - by Ghlouw
    I'm having a very frustrating problem with my Joomla 1.6 site. I cannot add any new extensions through the admin interface. I have tried to upload the extension, to use the search-folder option, and even the direct link. None of these options works; all that happens is that the page tries to load forever until it finally times out with a blank white page (no further error messages). I have tried this with multiple browsers (Chrome, FF, IE) and with different extensions (modules, components, templates - all with the same result). So I don't think it has anything to do with what I am uploading; more likely the problem is something with the post action. I have also seen the exact same error occur when I try to update menu items or even create new menu items. I am not getting this error with a duplicate of the site in the dev environment, only on my shared web hosting live server. This is on a Windows IIS / PHP / MySQL environment. Any help would be much appreciated!

    Read the article

  • How to handle multiple pages of the same site with the same outlinks

    - by pandafromchina
    I am developing a backlink tool for Chinese SEO (our website URL is http://link.aizhan.com), just like ahrefs.com. I encountered a problem: how to handle multiple pages of the same site with the same outlinks. For example, most pages of bbs.chinaz.com have the same outlinks, such as:

      bbs.chinaz.com/Tea/thread-6293993-1-1.html
      bbs.chinaz.com/Tea/list-1.html
      bbs.chinaz.com/alimama/thread-6265032-1-1.html
      bbs.chinaz.com/alimama/thread-6265032-2-1.html?userid=-1&extParms=
      bbs.chinaz.com/Shuiba/list-1.html
      bbs.chinaz.com/FeedBack/thread-4456753-1-1.html
      etc.

    All of these pages have the same outlinks at the top of the page: www.cnzz.com (anchor text: ????), www.313.com (????), www.idc123.com (????). Suppose I store these outlinks in a database. An SEO will then find six backlinks from bbs.chinaz.com to www.cnzz.com, which is obviously not useful. Can you tell me how you deal with this problem?
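
    One common way to handle this (an assumption on my part, not something stated in the question) is to collapse page-level links into one row per linking domain and target URL, keeping a count and one example page. A rough JavaScript sketch of that aggregation:

      // links: array of { sourceUrl, targetUrl } rows as crawled.
      function aggregateBacklinks(links) {
        var grouped = {};
        for (var i = 0; i < links.length; i++) {
          // Key on the linking host plus the target, not the full source URL.
          var sourceHost = links[i].sourceUrl.replace(/^https?:\/\//, '').split('/')[0];
          var key = sourceHost + ' -> ' + links[i].targetUrl;
          if (!grouped[key]) {
            grouped[key] = { domain: sourceHost, target: links[i].targetUrl,
                             examplePage: links[i].sourceUrl, pageCount: 0 };
          }
          grouped[key].pageCount++;   // report "linked from N pages" rather than N separate backlinks
        }
        return grouped;
      }

    The report can then show one line per linking domain with the page count, which is closer to how backlink tools typically present site-wide links.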

    Read the article

  • Content-light website and Google - telling Google it's a listings site (as opposed to a shop, reviews or restaurants)

    - by Doug Firr
    I have a listings-style website. Due to the nature of this (listings), the site is content-light. Each page is typically less than 50 words, but there are many pages. The site in question has had a ton of media coverage and so has some great inbound links from places like Wired, Fast Company, the Canadian Broadcasting Corporation and many other bloggers, media websites and recycling-related niche authors (it's a recycling site). But Google really ignores it. Traffic from search is very, very low - less than 5% of all traffic. I know that using markup you can tell Google whether your site is a restaurant, article, review, shop, local business and a few other categories (https://www.google.com/webmasters/markup-helper/u/0/). Is there a way to tell Google that my site is a listings site? I suspect, but do not know for sure, that part of the problem is that Google simply does not know what my site is. It's a crowdmap where people post curb alerts. The information is useful to people but it is presented in a short, concise way - a pin on a map, a picture and a short description. Adding anything further is not necessary for the site's intended purpose. First question: how best to tell the search engines what my site is - a listings site, not some spammy website? Second: any recommendations for improving our site's search presence? You can take a look here if interested: http://tinyurl.com/lxg4hn7
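
    There is no dedicated schema.org type for a "listings site" as a whole, but each listing page can carry structured data so search engines at least know what the page describes. A rough sketch, assuming a single curb-alert page and using schema.org's Product/Offer vocabulary; the field values are invented for illustration, and emitting the markup server-side in the page HTML would be the safer option:

      // Build schema.org markup for one listing and inject it as JSON-LD.
      var listing = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Free couch on the curb",            // invented example listing
        "description": "Blue three-seater, good condition, pick up anytime.",
        "image": "https://example.org/photos/couch.jpg",
        "offers": { "@type": "Offer", "price": "0", "priceCurrency": "USD" }
      };
      var script = document.createElement('script');
      script.type = 'application/ld+json';
      script.text = JSON.stringify(listing);
      document.getElementsByTagName('head')[0].appendChild(script);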

    Read the article

  • Web Site Performance and Assembly Versioning - Part 2: Versioning Combined Files Using Subversion

    - by capgpilk
    Ok, so it took a while to post this second part. Many apologies; we had a big roll-out of a new platform at work and many things had to get sidelined. This is the second part in a short series on website performance and using versioning to help improve it:

    - Minification and Concatenation of JavaScript and CSS Files
    - Versioning Combined Files Using Subversion - this post
    - Versioning Combined Files Using Mercurial - published shortly

    In the previous post we used AjaxMin to shrink js and css files, then concatenated them into one file each, named site-script.combined.min.js and site-style.combined.min.css. These file names are fine, but you can configure IIS 7 to cache these static files and so lower the amount of data transferred between server and client. This is done by editing the response headers in IIS.

    1. In IIS 7 Manager, choose the directory where these files are located and select HTTP Response Headers.
    2. Check Expire Web Content and set a time period well into the future.
    3. When refreshing the web page, the server will respond with HTTP 304, forcing the browser to retrieve the file from its cache.
    4. As can be seen in FireBug, the Cache-Control header has a max age of 31536000 seconds, which equates to 365 days.

    The server will always send this HTTP 304 message unless the file changes, forcing it to send new content. To help force this, we can change the file name based on the latest build, using the SVN revision number in the filename. So we have lowered data transfer on content that hasn't changed, but forced it to be sent when we have made a change to the css or js files. Now to get the SVN revision number into the file name:

    1. Import the MSBuildCommunityTasks targets, which can be downloaded from here.

      <Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

    2. Edit the BeforeBuild target to call out to svn and get the latest revision.

      <SvnVersion LocalPath="$(MSBuildProjectDirectory)"
                  ToolPath="$(ProgramFiles)\VisualSVN Server\bin">
        <Output TaskParameter="Revision" PropertyName="Revision" />
      </SvnVersion>

    3. Set it to update the project AssemblyInfo.cs file with the svn revision.

      <FileUpdate Files="Properties\AssemblyInfo.cs"
                  Regex="(\d+)\.(\d+)\.(\d+)\.(\d+)"
                  ReplacementText="$1.$2.$3.$(Revision)" />

    4. Now edit the AfterBuild target to get the full dll version. You could combine these two steps and just get the version from svn; I am working on one project that updates the AssemblyInfo file and another that allows manual editing of the file but needs that version within the file name, so I just combined the two for this post.

      <MSBuild.ExtensionPack.Framework.Assembly TaskAction="GetInfo"
                                                NetAssembly="$(OutputPath)\mydll.dll">
        <Output TaskParameter="OutputItems" ItemName="Info" />
      </MSBuild.ExtensionPack.Framework.Assembly>
      <Message Text="Version: %(Info.AssemblyVersion)" Importance="High" />

    5. Use this Info.AssemblyVersion to write out the combined css and js files as described in the last post.

      <WriteLinestoFile File="Scripts\site-%(Info.AssemblyVersion).combined.min.js"
                        Lines="@(JSLinesSite)" Overwrite="true" />

    In the next post I will cover doing the same, but for a Mercurial repository.

    Read the article

  • Puppetmaster doesn't notice changes to site.pp

    - by tore-
    I've just set up a new production environment with puppet, using 0.25.4 in client/server mode. Ruby is at 1.8.5, CentOS 5.4. I've made a simple manifest for configuring yum-updatesd, but the puppetmaster doesn't seem to notice changes made to site.pp:

      err: Could not parse for environment production: Could not match 'node' at /etc/puppet/manifests/site.pp:1
      err: Could not retrieve catalog from remote server: Error 400 on SERVER: Could not parse for environment production: Could not match 'node' at /etc/puppet/manifests/site.pp:1

    Notice that it says line 1, but line 1 contains an import statement:

      # cat -n /etc/puppet/manifests/site.pp
           1  import "update-notification"
           2
           3  node default {
           4      include update-notification
           5      update-notification::configure()
           6  }

    I've tried rebooting the server, deleting and recreating site.pp, and starting and stopping puppetmaster and puppet, with no luck. What am I missing?

    Read the article

  • Lock down a site using Forms Auth in IIS7 with Windows Auth

    - by Josh
    I have an ASP.NET MVC 1.0 application that uses Forms Authentication. We are using Windows Server 2008. I need to lock down the site so that only certain users (in AD groups) can access it. Unfortunately, when I set the site to disallow anonymous users and use Windows Authentication, then because of the integration between the site and IIS it shows the user as signed in under their domain account, instead of allowing them to sign in through Forms Auth. So, I need mixed-mode authentication: the site should be accessible only through Windows auth, without anonymous users, but once you are in, it needs to use Forms auth only. How would I go about doing this the right way?

    Read the article

  • Apply SharePoint template to existing site?

    - by johnnyb10
    I have several similar SharePoint sites (running on WSS 3) and I have saved one of the sites as a template. I now want to make a different site (which already exists) have the same structure as this site--the same lists, document libraries, views, etc. I know I can delete the existing site and then recreate it based on this template, but is there a way to apply this template to my existing site, so that it gets rid of its existing lists, etc., and replaces them with the ones from the template? I don't have any content in the site, and I don't want to keep any of the existing structures, so I don't care if anything gets swept away. I may need to do this with a bunch of sites in the future, so being able to apply the template rather than recreating from scratch might be very helpful.

    Read the article

  • IIS load balancing and site deployment

    - by KLC
    Currently I have a site that sits on one IIS 7 server. When we deploy a new version of the site, we bring the site down and display an offline page. What I really want is to have two exact copies of the site in one IIS 7 server and load-balance users across both. When we deploy a new version, we would bring site1 down (users on site1 automatically route to site2 on the next postback); when the site1 deployment is complete, we would bring site2 down (users on site2 being routed to site1 on the next postback). Is this even possible?

    Read the article

  • Suddenly blocked from a site

    - by Diego Romero
    For some time now I haven't been able to reach a site I used to visit frequently for maintenance (WordPress). I tried different browsers, restarting my laptop, and clearing cache, history, and cookies. I also pinged the site IP: 4 packets sent and 4 lost. I think this is a problem with only my laptop, since I've been able to reach the site from other devices on the same network. I have also tried connecting to the same site from a completely different network, with the same problem. I really don't know what to do about this; any advice? PS: the site is hosted on WP Engine, if that has anything to do with this problem.

    Read the article

  • Changing Admin Site URL (actually port) - how?

    - by TomTom
    I have a new install of the brand-new SharePoint 2010. I use host-header-identified site collections for everything. By default the admin site is on a random port. I would like to move the admin site to port 80 on the server name. As all sites have coded names (for example "intranet", "projects"), this would allow administration via the server name, which is easier since external access does not have to remember the port number. How do I do this? I already changed the default URL, but the site (application) is still wrongly mapped. I don't find anything in the admin site to change the IIS settings. I may just be missing it, so can anyone point me in the right direction?

    Read the article

  • moving my site, IP change worries...

    - by Sherif Buzz
    Hi all, my site has outgrown the shared hosting account it's on, and I've set up a VPS that I'll be moving to soon. I cannot keep the same IP between my new account and the old one, and I'm a bit at a loss as to how to minimize user downtime while the new IP propagates to all DNS caches. Note that I cannot have the site running on both accounts at the same time, as it's a dating site and this would cause data inconsistency. Here's what I am planning to do: (1) put up an 'under maintenance' page on the old host; (2) get the site up and running on the new host, and update the domain to point to the new host; (3) hope the downtime isn't too long. Would it be a good idea to have a link on the page in (1) that opens the new site, but using its IP? Or even to redirect all requests at the old host to the new one (again by IP)? Any advice much appreciated.

    Read the article

  • What should we tell our unsupported IE6 users?

    - by Dan Fabulich
    In the upcoming version of our web app, we've broken IE6, and we don't intend to fix it. We've had a clear warning posted for IE6 users for some months; we've decided it's time not to support it. My question is: how should we communicate this to our users? Some people here feel that we should block IE6 users who would try to access the web app, because it's not going to work for them. Others feel that we should just leave up a warning, saying "This doesn't work in IE6," but not block them; instead, if they click to dismiss the warning, just let them in to the broken site to see for themselves that it doesn't work. Who is right? Is there a better way?
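
    If the warn-but-don't-block option wins, the mechanics are simple. A rough sketch of one way to do it; the element ids and cookie name are invented, and the banner markup itself would live in the page template:

      // IE6 reports "MSIE 6." in its user-agent string.
      var isIE6 = /MSIE 6\./.test(navigator.userAgent);
      if (isIE6 && document.cookie.indexOf('hideIE6Warning=1') === -1) {
        var banner = document.getElementById('ie6-warning');
        banner.style.display = 'block';
        document.getElementById('ie6-warning-dismiss').onclick = function () {
          // Remember the dismissal for a year (IE6 needs 'expires', not 'max-age').
          var expires = new Date();
          expires.setFullYear(expires.getFullYear() + 1);
          document.cookie = 'hideIE6Warning=1; expires=' + expires.toUTCString() + '; path=/';
          banner.style.display = 'none';
          return false;
        };
      }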

    Read the article

  • What is the SEO impact of moving my domain to another IP address and what is the right way of doing this?

    - by ElHaix
    I am planning to move several websites to a new hosting provider - keeping the same URLs, but they will resolve to different IP addresses. For example, some sites are Canadian-content-only sites, hosted on .CA domains sitting on Canadian IP addresses. I want to move these to Amazon servers, which have US IP addresses. The domain names will remain the same. (1) What is the SEO impact of this? (2) Will the sites lose some ranking if they are moved to a new IP address (Canadian or not), and if so, what is the cleanest way of accomplishing this (some kind of 301s)?

    Read the article
