Search Results

Search found 8020 results on 321 pages for 'webcenter sites'.

Page 11 of 321

  • .NET 4.5 now supported with Windows Azure Web Sites

    - by ScottGu
    This week we finished rolling out .NET 4.5 to all of our Windows Azure Web Sites clusters. This means that you can now publish and run ASP.NET 4.5 based apps, and use .NET 4.5 libraries and features (for example: async and the new spatial data-type support in EF), with Windows Azure Web Sites. This enables a ton of really great capabilities - check out Scott Hanselman's great post of videos that highlight a few of them. Visual Studio 2012 includes built-in publishing support to Windows Azure, which makes it really easy to publish and deploy .NET 4.5 based sites within Visual Studio (you can deploy both apps + databases). With the Migrations feature of EF Code First you can also do incremental database schema updates as part of publishing (which enables a really slick automated deployment workflow). Each Windows Azure account is eligible to host 10 free web sites using our free tier. If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today. In the next few days we'll also be releasing support for .NET 4.5 and Windows Server 2012 with Windows Azure Cloud Services (Web and Worker Roles) - together with some great new Azure SDK enhancements. Keep an eye out on my blog for details about these soon. Hope this helps, Scott. P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • .NET 4.5 is now supported on Windows Azure Web Sites

    - by Leniel Macaferi
    This week we finished rolling out .NET 4.5 to all of our clusters that host Windows Azure Web Sites. This means that you can now publish and run applications based on ASP.NET 4.5, and use .NET 4.5 libraries and features (for example: async and the new spatial data type support in Entity Framework) with Windows Azure Web Sites. This enables a ton of great capabilities - check out Scott Hanselman's post with videos (in English) that highlight a few of these features. Visual Studio 2012 includes built-in support for publishing an application to Windows Azure, which makes it very easy to publish and deploy .NET 4.5 based sites from within Visual Studio (you can publish apps + databases). With the Migrations feature of the Entity Framework Code First approach, you can also make incremental database schema updates as part of the publishing process (which enables an extremely automated publishing workflow). Each Windows Azure account is eligible to host up to 10 web sites for free, using our "Shared" scaling tier. If you don't yet have a Windows Azure account, you can sign up for a free trial and start using these features today. In the next few days we will also release support for .NET 4.5 and Windows Server 2012 for Windows Azure Cloud Services (Web and Worker Roles) - together with some great new improvements to the Windows Azure SDK. Keep an eye on my blog for more information about these releases soon. Hope this helps, - Scott. P.S. Besides the blog, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/ScottGu. Translated from the original post by Leniel Macaferi.

    Read the article

  • Taking web sites offline for demonstration

    While working in software development in general, and in web development for a couple of customers, it is quite common to need a test bed where the client can get an image - or better said, a feeling - for the visions and ideas you are talking about. Usually here at IOS Indian Ocean Software Ltd. we set up a demo web site on one of our staging servers and provide credentials to the customer to access and review our progress and work ad hoc. This gives us the highest flexibility on both sides, as the test bed is simply online and available 24/7. We can update the structure, the UI and the data at any time, and the client is able to view it whenever it suits her/him best.

    Limited or lack of online connectivity
    But what happens when your client is not able to be online, no matter for what reason? Here are some of the more obvious ones:
    - No internet connection (permanent or temporary)
    - Expensive connectivity, e.g. mobile data packages, hotel rates, etc.
    - Presentation devices at an exhibition, e.g. tablets or iPads
    - Being abroad for a certain time, and only occasionally online
    - No network coverage, especially on mobile
    - Bad infrastructure, as in some Third World countries
    - Providing a catalogue on CD or USB pen drive
    Anyway, it doesn't matter really. We should be able to provide a solution for the circumstances of our customers.

    Presentation during an exhibition
    Recently, we had the following request from a customer: "Is it possible to let us have a desktop version of ResortWork.co.uk that we can use for demo purposes at the forthcoming Ski Shows? It would allow us to let stand visitors browse the sites on an iPad to view jobs and training directory course listings." Yes, sure, we can do that. Eventually, you might think: why don't they simply use 3G-enabled iPads for that purpose? As stated above, there might be several reasons - low coverage, expensive data packages, etc. Anyway, it is not a question of how to circumvent the request but of delivering a solution for it.

    Possible solutions... or not?
    We have done offline websites before, and have even established complete mirrors of one or two web sites on our systems. There are actually several possibilities to handle this kind of request, and it mainly depends on the system or device the offline site should be available on. Here, it is clearly stated that we have to address this on an Apple iPad - well, actually, I think they'd like to use multiple devices during their exhibitions. The following is an overview of possible solutions depending on the technology or device in use, and how each can be done:
    - Replication of source files and database: The web site mentioned above runs on ASP.NET, IIS and SQL Server. If a laptop or slate runs a Windows OS, the easiest way would be to take a snapshot of the source files and database, and transfer them as a local installation to those Windows machines. This approach would be fully operational on the local machine.
    - Saving pages for offline usage: This is actually a quite tedious job, but still practicable for small web sites.
    - Tool-based approach to 'harvest' the web site: There are quite some tools in the wild that could handle this job, namely wget, HTTrack, Web Copier, etc.
    - Screenshots bundled as a PDF document: Not really... ;-)
    - Creating a screencast or video: Simply navigate through your website and record your desktop session. Actually, we use this kind of approach to track down difficult problems, in order to see and understand exactly what the user was doing to cause an error.
    Of course, this list isn't complete and I'd love to get more of your ideas in the comments section below the article.
    Preparations for offline browsing
    The original web site is dynamic and data-driven, built on ASP.NET. As we have to put the result onto iPads, we chose the tool-based approach to 'download' the whole web site for offline usage. Again, depending on the complexity of your web site, you might have to check which of the applications produces the best results for you. My usual choice is wget, but in this case we ran into problems related to the rewriting of hyperlinks. As a consequence, we opted for HTTrack. HTTrack comes in different flavours: as a console application, but also as a GUI (WinHTTrack on Windows) or a web client (WebHTTrack on Linux/Unix/BSD). Here's a brief description of HTTrack, taken from its original website: "HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility. It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the 'mirrored' website in your browser, and you can browse the site from link to link, as if you were viewing it online." There is also extensive documentation for all options and switches online. A general recommendation is to go through the HTTrack Users Guide by Fred Cohen. It covers all the initial steps you need to get up and running.
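    For orientation, a minimal HTTrack invocation of the kind used here might look like the following sketch; the filter patterns are illustrative and would come out of the fine-tuning runs described below:

        # mirror the site into a local folder ('+' patterns whitelist linked hosts)
        httrack "http://www.resortwork.co.uk/" \
            -O ./resortwork-offline \
            "+www.resortwork.co.uk/*" \
            "+www.247recruit.net/*" \
            -v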
    Be aware that it will take quite some time to get all the necessary resources down to your machine. Actually, for our customer we ran the tool directly on their web server to avoid unnecessary traffic and bandwidth. After a couple of runs and some additional fine-tuning - explicit inclusion or exclusion of various externally linked web sites - we finally had a more or less complete offline version available. A very handy feature of HTTrack is the error/warning log it writes after completing the download. It contains detailed information about errors that appeared on the pages and in the links within the pages that have been processed:

    Error: "Bad Request" (400) at link www.resortwork.co.uk/job-details_Ski_hire:tech_or_mgr_or_driver_37854.aspx (from www.resortwork.co.uk/Jobs_A_to_Z.aspx)
    Error: "Not Found" (404) at link www.247recruit.net/images/applynow.png (from www.247recruit.net/css/global.css)
    Error: "Not Found" (404) at link www.247recruit.net/activate.html (from www.247recruit.net/247recruit_tefl_jobs_network.html)

    In our situation, we took the records of HTTP 400/404 errors and passed them to the web development department. Improvements are to be expected soon. ;-)

    Quality assurance on the full-featured desktop
    Unfortunately, the generated output of HTTrack was still incomplete, but luckily only images were missing. Being directly on the web server, we simply copied the missing images from the original source folder into our offline version. After that, we created an archive and transferred the file securely to our local workspace for further review and checks. From that point on, it wasn't necessary to get any more files from the original web server, and we could focus completely on browsing and navigating through the offline version to isolate visual differences and functional problems. As said, the original web site runs on ASP.NET Web Forms and uses postback calls for interactions like search, pagination and, partly, navigation. This is the main field for improving the offline experience. Of course, just as for standard web development, it is advisable to test with various browsers, and strangely we discovered that the offline version looked pretty good in Firefox, Chrome and Safari, but not in Internet Explorer. A quick look at the HTML source shed some light on this: there are conditional CSS inclusions based on the user agent. HTTrack does not identify itself as Internet Explorer, so we didn't have the necessary overrides for this browser. Not problematic in our case after all, but you might have to pay attention to this and fetch the IE-specific files explicitly. And while having a look at the source code, we also found out that HTTrack actually modifies the generated HTML output. On several occasions we discovered that <div> elements had been converted into <table> constructs for no obvious reason - even nested structures.

    Search 'e'nd destroy - sed (or Notepad++) to the rescue
    During an intensive root cause analysis for a couple of HTML/CSS problems that needed extra attention, it is very helpful to be familiar with an editor that allows search and replace over multiple files - e.g. sed, the stream editor for filtering and transforming text on Linux, or my personal favourite, Notepad++ on Windows. This allowed us to quickly fix a lot of anchors with onclick attributes and JavaScript code that was addressed to ASP.NET files instead of their generated HTML counterparts, like so:

    grep -lr -e '\.aspx' * | xargs sed -i -e 's/\.aspx/.html?/g'

    The additional question mark after the HTML extension helps to separate the query string from the actual target, and solved all our missing hyperlinks very fast. The same can be done in Notepad++ on Windows, too. Just use the 'Replace in Files' feature and you are set, especially in combination with regular expressions (regex).

    Landscape of browsers
    Okay, after several runs of HTML/CSS code analysis, searching and replacing strings in a pool of more than 4,000 files, we finally had a very good match for an offline browsing experience in Firefox and Chrome on Linux. Next, we transferred that modified set of files to a Windows 8 machine for review in Firefox, Chrome and Internet Explorer 7 to 10, and to a Mac mini running Mac OS X 10.7 to check the output in Safari and, again, in Chrome. Apart from IE, for the reasons already mentioned above, the results were identical. And last but not least, it was time to check the web site on tablets. Please continue reading in the follow-up articles: Taking web sites offline for demonstration on Galaxy Tablet - Taking web sites offline for demonstration on iPad

    Read the article

  • Chrome Mobile Monthly: Responsive vs Separate Sites

    Join us on Wednesday, October 31st at 9am PT for our Monthly Mobile Web Hangout! This month +Brad Frost will be joining us to talk about responsive design versus separate mobile sites. And in keeping with the season, it's a special Presidential Smackdown Edition. The US presidential race is in full swing, and the candidates are intensely debating the country's hot-button issues. The web design world is entrenched in our own debate about how to address the mobile web: should we create a separate mobile site or create a responsive experience instead? It just so happens that the two US presidential candidates have chosen different mobile web strategies for their official websites. In the red corner is Republican candidate Mitt Romney's dedicated mobile site, while in the blue corner is incumbent president Barack Obama's responsive website. Which will prevail? Sit back, crack open a cold one, and watch the battle unfold as Brad dissects the candidates' sites to uncover best practices and common mobile web pitfalls.

    Read the article

  • Unable to view 2 local sites over network

    - by gentrobot
    I have 2 websites running on my local machine that I'd like to view from other machines on the same network. For /etc/apache2/sites-available/site1.com:

    <VirtualHost *:80>
        ServerName site1.com
        DocumentRoot /var/www/answers/app/webroot
        DirectoryIndex index.php
        <Directory "/var/www/answers/app/webroot">
            Options FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>

    For /etc/apache2/sites-available/site2.com:

    <VirtualHost *:80>
        ServerName site2.com
        DocumentRoot /var/www/answers2/app/webroot
        DirectoryIndex index.php
        <Directory "/var/www/answers2/app/webroot">
            Options FollowSymLinks
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>

    I have added 2 entries to the /etc/hosts file:

    127.0.0.1 site1.com
    127.0.0.1 site2.com

    Now, when I point the browser on my machine to site1.com, it shows me the first site, and pointing the browser to site2.com shows me the second site. However, when I type the local IP of my machine into the browser, it always shows site2. How can I change it to switch between site1 and site2? Is there a way I can view both sites from another machine (esp. mobile devices over a wireless network)?
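    Name-based virtual hosts are selected by the Host header of the request, so a bare IP address matches neither ServerName and Apache falls back to its default (first-loaded) vhost. A sketch of one way to reach both sites from another machine on the LAN, assuming the server's address is 192.168.1.10 (a placeholder; substitute your own):

        # on each client machine: map the vhost names to the server's LAN IP
        echo '192.168.1.10 site1.com site2.com' | sudo tee -a /etc/hosts
        # then browse to http://site1.com/ or http://site2.com/ as usual

    Mobile devices without an editable hosts file would need the names to be resolvable on the network itself, e.g. via a local DNS entry on the router.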

    Read the article

  • Using a CDN for CMS software (multiple sites)

    - by SmokeyPHP
    I'm currently researching ideas for the media management side of a CMS I'm writing. I was looking at having images served from a CDN, which is fine for a single site, but I want all sites that run the CMS to make use of a CDN (most likely a custom-developed one, rather than a third-party service like S3). My main question is: is a multi-site CDN a good idea? I can't think of a downside, but I have probably missed something. Obviously the sites won't share the same folder; I envisage requests like css.cdnsite.com/example.com/style.css or something along those lines. Having multiple sites in the same place will obviously make it easier for us to manage, as well as being cheaper, but then I wonder if it'll be worth it... Long story short: how should the CMS handle user-uploaded media across separate installations?
    - Just keep a local copy of all assets and serve them from the same site, like in days of yore?
    - Keep a local copy, force sites to use www. and have CDN subdomains per site?
    - Or use a single separate CDN for all sites?
    Apologies for the length of this question; I'm not sure whether this should be multiple questions, as all parts are related and could affect each other.
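    For what it's worth, the URL scheme sketched in the question (css.cdnsite.com/example.com/style.css) is easy to prototype. A hypothetical helper, using the hostnames from the question purely as placeholders:

        # build a CDN URL: <asset-type subdomain>.cdnsite.com/<site domain>/<path>
        cdn_url() {
            printf 'https://%s.cdnsite.com/%s/%s\n' "$1" "$2" "$3"
        }
        cdn_url css example.com style.css   # -> https://css.cdnsite.com/example.com/style.css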

    Read the article

  • Arguments for discouraging satellite sites?

    - by Jjdelc
    I am working with a client who has read about satellite sites in an SEO book and has been building hundreds of keyword-rich domains (buytoyotacorona1989cheap.com, brandnewsuvinred.com) with content specific to those domains. They all link to a main domain (CompanyName.com) where most of the information is either repeated (from the other sites) or new. I told him to drop all the other domains and focus only on building good content for the main site, as it is too difficult to maintain so many websites, plus they might look like link farms to Google. He told me to run a Google search for "Buy Toyota cheap", and two of his websites were listed in the top 10. So it seems to be doing some good, but I get the feeling that what he is doing is wrong. What other arguments are there to discourage this practice? Or is he doing the right thing? My arguments have helped him decide to go down from hundreds to close to one hundred (because of the cost of maintenance), but I believe he should only have one or two sites. PS: The business is not actually about cars.

    Read the article

  • Enterprise 2.0 Conference: Building Social Business

    - by kellsey.ruppel
    The way we work is changing rapidly, offering an enormous competitive advantage to those who embrace the new tools that enable contextual, agile and simplified information exchange and collaboration to distributed workforces and networks of partners and customers. As many of you are aware, Enterprise 2.0 is the term for the technologies and business practices that liberate the workforce from the constraints of legacy communication and productivity tools like email. It provides business managers with access to the right information at the right time through a web of inter-connected applications, services and devices. Enterprise 2.0 makes accessible the collective intelligence of many, translating to a huge competitive advantage in the form of increased innovation, productivity and agility. The Enterprise 2.0 Conference takes a strategic perspective, emphasizing the bigger-picture implications of the technology and the exploration of what is at stake for organizations trying to change not only tools, but also culture and process. Beyond discussion of the "why", there will also be in-depth opportunities for learning the "how" that will help you bring Enterprise 2.0 to your business. You won't want to miss this opportunity to learn and hear from leading experts in the fields of technology for business, collaboration, culture change and collective intelligence. Oracle is a proud Gold sponsor of the Enterprise 2.0 Conference, taking place this week in Boston. Come and learn about Oracle at the following panel sessions and Market Leaders Theater Sessions:
    - Tuesday, June 19, 2012 at 1:30 p.m. - Market Theater Presentation: Into the Activity Stream, and Beyond! Introducing Oracle Social Network. Oracle speaker: Christian Finn, Senior Director of Evangelism, Oracle WebCenter
    - Tuesday, June 19, 2012 at 2:30 p.m. - Panel Session: Innovation versus Integration. Oracle panel speaker: Christian Finn, Senior Director of Evangelism, Oracle WebCenter
    - Wednesday, June 20, 2012 at 1:30 p.m. - Business Leadership Roundtable. Oracle panel speaker: Christian Finn, Senior Director of Evangelism, Oracle WebCenter
    - Wednesday, June 20, 2012 at 3:00 p.m. - Market Theater Presentation: Into the Activity Stream, and Beyond! Introducing Oracle Social Network. Oracle speaker: Christian Finn, Senior Director of Evangelism, Oracle WebCenter
    - Thursday, June 21, 2012 at 8:30 a.m. - Panel Session: Collecting and Processing Big Data: Architecting Systems that Scale. Oracle panel speaker: Ashok Joshi, Senior Director, Berkeley DB Development
    - Thursday, June 21, 2012 at 11:00 a.m. - Panel Session: The Future of Big Data: What's Next. Oracle panel speaker: Ashok Joshi, Senior Director, Berkeley DB Development
    Be sure to stop by and visit Oracle booth #501 to see live demonstrations of Oracle Social Network and Oracle WebCenter!

    Read the article

  • Segmentation and Targeting: Your Tools for Personalizing the Online Customer Experience

    - by Christie Flanagan
    In order to deliver the kind of personalized and engaging online experiences that customers expect today, look to segmentation and targeting. Segmentation is the practice of dividing your site visitors into distinct groups based on shared characteristics or behavior - for example, a segment may consist of site visitors who have visited pages related to a certain product type, or of visitors within the same age group or geographic area. The idea is that those within a segment are more likely to have common needs, problems or interests that can be served by your business. Targeting is the process by which the most relevant content, whether an article, promotion or other piece of content, is delivered to your visitors based on their segment membership. Segmentation and targeting are used to drive greater engagement on your web presence by delivering content to your site visitors that is tailored to their interests, behavior or other attributes. You may have a number of different goals for your segmentation and targeting efforts:
    - Up-sell or cross-sell to your customers
    - Conduct A/B testing on your offers and creative
    - Offer discounts, promotions or other incentives for the time and duration that you specify
    - Make it easier to find relevant information about products and services
    - Create a premium content model
    There are two different approaches you can take toward segmentation and targeting for your online customer experience initiatives. The first is more of a manual process, in which marketers manage the process of determining which segments to create and which content to target to those segments. The benefit of this approach is that it gives marketers a high level of control over the whole process, which works well when you have a thorough understanding of your segments and of which content is most likely to serve their needs. Tools for marketer-managed segmentation and targeting are often built right into your WEM platform, as they are with Oracle WebCenter Sites. The downside is that the more segments and content you have, the more time-consuming and complicated it can be to manage manually. The second approach relies on predictive intelligence to automate the segmentation and targeting process. This allows optimization to occur in real time, reduces the burden of manual segmentation and targeting, and can result in new insights into segments that you may never have thought of on your own. It also provides you with the capability to quickly test new offers and promotions on your site. Predictive segmentation and targeting can be achieved by using Oracle WebCenter Sites and Oracle Real-Time Decisions together. Get a taste of how Oracle WebCenter Sites and Oracle Real-Time Decisions combine to deliver powerful capabilities for predictive segmentation and targeting by watching this on-demand webcast introducing Oracle WebCenter Sites 11g, or by reading IDC's take on the latest release of Oracle's web experience management solution. Be sure to return to the Oracle WebCenter blog on Thursday for a closer look at how to optimize the online customer experience using these two products together.

    Read the article

  • Restrict only some plugins to specific sites in Google Chrome

    - by Christian
    I am looking for a way to set up Google Chrome so that it will run a certain plug-in (Java, what else?) only on whitelisted sites, but other plug-ins (like the PDF viewer) everywhere. From playing with the policies available for Chrome, I think there are basically two levels of plug-in management:
    - List of disabled plugins / List of enabled plugins: controls whether a plug-in exists for the browser at all. This pair of policies applies to plug-ins, but not to sites.
    - Default plug-in settings / Allow plug-ins on sites: controls on which sites plug-ins can run. This set of policies applies to sites, but not to individual plug-ins, and it cannot override the first pair.
    There appears to be no way to configure Chrome so that some plug-ins only run on whitelisted sites, while others run everywhere by default. I have also looked at filtering content at the firewall/proxy level, but I'm not convinced it can be done securely there. Filtering by URLs (file names) or content types can be circumvented trivially, and identification by content inspection cannot be safe either.
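    For illustration only, a sketch of how those two policy families look in a managed policy file (on Linux, e.g. /etc/opt/chrome/policies/managed/plugins.json), using policy names from Chrome's enterprise documentation of that era; the whitelisted URL is a placeholder:

        {
          "DisabledPlugins": ["Java*"],
          "DefaultPluginsSetting": 1,
          "PluginsAllowedForUrls": ["https://intranet.example.com"]
        }

    DisabledPlugins/EnabledPlugins act per plug-in, while DefaultPluginsSetting and PluginsAllowedForUrls act per site; as the question concludes, the two axes cannot be combined into "this one plug-in, only on these sites".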

    Read the article

  • Explore Historic Sites from the Comfort of Your Desktop with Google’s ‘World Wonders Project’

    - by Asian Angel
    Have you always wanted to explore historic sites across the world but lack the extra time and/or funds to do so? Then take heart! Now you can visit historic sites to your heart's content from home with Google's 'World Wonders Project'. Note: The screenshot shown above is from the 'Archaeological Areas of Pompei' site. You can explore exotic locations such as Pompei, the Palace and Park of Versailles, Shark Bay, the Tenryu-ji Temple in Ancient Kyoto, and more. See the World Wonders Project homepage and the World Wonders Project YouTube channel.

    Read the article

  • Joomla Sites hacked by DR-MTMRD [closed]

    - by RedLEON
    Possible Duplicate: My Sites Were Hacked. What To Do?
    A few of my Joomla sites were hacked. After I became aware of this, I did these things:
    - Changed hosting passwords (MySQL, FTP, control panel)
    - Renamed the Joomla admin user back to "admin" in the users table (how had the hacker changed the user name?)
    - Upgraded Joomla to the latest version
    - Added a php.ini to the root directory of the host
    - Disabled CGI access
    But the site is still hacked. I checked the index.php file and overwrote it with the original index.php, but the site is still hacked. How is this possible?
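    Since the defacement survives overwriting index.php, a backdoor elsewhere in the site is the likely cause. A generic sketch for hunting recently modified or suspicious PHP files (adjust the docroot path to your own installation):

        # PHP files changed in the last 7 days
        find /path/to/docroot -name '*.php' -mtime -7 -ls
        # common obfuscation markers used by PHP backdoors
        grep -rlE 'base64_decode|eval\(|gzinflate' /path/to/docroot --include='*.php'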

    Read the article

  • Permissions & File Structure w/ nginx & multiple sites

    - by Michael
    I am using nginx for the first time as a long-time Apache user. I set up a Linode to test everything and to eventually port over my websites. Previously I had /home/user/www as the wwwroot, and I am looking at doing something similar with /srv/www/domain/www (wwwroot). The reason for using that rather than /srv/domain as the wwwroot is that many of the sites are WordPress, and one of the things I do for security is to move the config file one level above the wwwroot; I can't have multiple configuration files from multiple domains in the same top-level folder. Since I own all the sites, I wasn't going to create a user for each domain. My user is a member of www-data, and I was going to use mode 2770 for each www directory, with domain/www for each new domain. www would be owned by the group www-data. Is this the best way to handle this?
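    A sketch of the layout and permissions described above, with example.com standing in for a real domain and michael for the deploy user (both placeholders):

        sudo mkdir -p /srv/www/example.com/www
        sudo chown -R michael:www-data /srv/www/example.com
        sudo chmod 2770 /srv/www/example.com/www   # setgid bit: new files inherit the www-data group
        # WordPress config kept one level above the webroot, per the scheme above:
        # /srv/www/example.com/wp-config.php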

    Read the article

  • Best sites to find good .NET Developers

    - by Mag20
    I am looking for good sites to post a position for a .NET developer. I already tried Craigslist (got about 10 resumes, but most couldn't answer our technical questions) and StackOverflow Careers (no responses). What sites did you have success with for finding good developers? UPDATE 1: Some more information: my company is in NJ. We are a small startup, fewer than 10 people. Monster, Dice and CareerBuilder all charge about $500 a month per posting, which seems a bit much, and only Dice specifically targets technical positions. With Monster and CareerBuilder I am a bit worried about having to go through hundreds of resumes that don't apply.

    Read the article

  • Becoming a Certified Information Professional

    - by Lance Shaw
    Yesterday, we participated with AIIM in a webinar about the Certified Information Professional (CIP) program that they are now offering.  The interest level is very high in the program, as evidenced by the high turnout at the event. You might be asking yourself, what does the Oracle WebCenter team care about an AIIM certification program? Well, we sponsored this program because we consistently find that the more educated our customers and prospects are, the more value they are going to get out of the technology we provide.  As an ECM vendor, we provide plenty of WebCenter product training and certifications to help you make the most of WebCenter technology. While these are essential and valuable, technologists that also have an operational command of the business and the various impacts that the flow of information can have are even more valuable to an organization. Thinking about the management of content and information and its effect on business process can have wide-ranging benefits, not only to your company but to your personal bottom line.  And let's be honest, a customer who is looking holistically at how content is managed is going to see more opportunities to leverage that content and in many cases, this will motivate the purchase of additional product licenses.   Now if you are regretting the fact that you missed the webinar yesterday, never fear!  It is now available for playback and you can view it at your convenience by visiting the AIIM website. We hope you find it informative and that you can personally profit from being able to showcase your certification as an Information Professional. Additionally, we hope it will help you identify additional opportunities to leverage Oracle WebCenter in order to further reduce your operational costs and drive your business forward.

    Read the article

  • Services - Separate Sites or One Site - Impact on SEO

    - by Lynda
    I have a client who is a lawyer specializing in criminal defense and DUI, but he does not show up well in Google. In researching, I found that the sites that rank better have much more content for those specialties than his site does, and my thought is that he needs to add more quality content to rank better for those searches. On his site he mentions his specialties, but he also has various personal things that reflect his interests. These are clearly separated from the business portion. My questions are: 1) should he separate his personal information into a new domain, and 2) should he have a separate URL for each of his specialties? Or would one URL work as long as everything is clearly separated? I read once that for legal services to rank well, you should make a separate site for each specialty and have that site focus solely on that service.

    Read the article

  • Why are subdomains of Blogspot/WordPress-like sites treated as different domains or sites?

    - by Thedijje
    As I understand it, maps.google.com and mail.google.com are subdomains of the same domain. The web at large treats these subdomains as part of the main domain, and they share the same Alexa rank, PageRank and so on. On the other hand, look at blogspot.com, wordpress.com or webs.com: blogs or websites under those domains are treated as different sites. Each has its own URL, its own PageRank and its own Alexa rank. There are millions of subdomains under those few domains, often with nearly the same IP addresses, hosting and CMS, yet they are called different domains. Why?
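    One mechanism worth knowing here (an addition, not from the question itself): domains such as blogspot.com appear on the Public Suffix List, which tells browsers, crawlers and ranking services to treat each subdomain as an independent registrable site, whereas google.com is not a public suffix, so its subdomains remain part of one site. A quick way to check the list:

        # does blogspot.com appear as an entry in the Public Suffix List?
        curl -s https://publicsuffix.org/list/public_suffix_list.dat | grep -x 'blogspot.com'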

    Read the article

  • Yahoo announces an agreement with Facebook: the two sites strengthen their ties and become partners

    Yahoo and Facebook have just announced a closer relationship between their two sites. Their services are now linked in certain respects: a user with an account on both platforms will be able to see notifications about their activity on one site appear on the other. In addition, users of certain Yahoo services (such as Flickr or Yahoo Answers) will be able to share the data hosted there more easily with their Facebook friends. It will also be possible for them to update both their Yahoo profile and their Facebook profile in one ...

    Read the article

  • HTG Explains: How Hackers Take Over Web Sites with SQL Injection / DDoS

    - by Jason Faulkner
    Even if you’ve only loosely followed the events of the hacker groups Anonymous and LulzSec, you’ve probably heard about web sites and services being hacked, like the infamous Sony hacks. Have you ever wondered how they do it? There are a number of tools and techniques that these groups use, and while we’re not trying to give you a manual to do this yourself, it’s useful to understand what’s going on. Two of the attacks you consistently hear about them using are “(Distributed) Denial of Service” (DDoS) and “SQL Injections” (SQLI). Here’s how they work. Image by xkcd

    Read the article

  • Today's Well Connected Companies

    - by Michael Snow
    Statoil Fuel & Retail and their partner, L&T Infotech, our recent winner of the Oracle Excellence Award for Fusion Middleware Innovation in the WebCenter category, are featured this month in Profit Magazine's November issue, in both the print and online versions. The online version has significantly more detail about their "Connect" project. Statoil Fuel & Retail is a leading Scandinavian road transport fuel retailer that operates in 8 different countries and delivers aviation fuel at 85 airports. The company produces and sells 750 different lubricant products for B2B and B2C customers. Statoil won the 2013 Oracle Excellence Award for Oracle Fusion Middleware Innovation: Oracle WebCenter based on a stellar Oracle implementation, created with implementation partner L&T Infotech, which used Oracle's JD Edwards and Oracle Fusion Middleware to replace and consolidate 10 SAP portals into a single, integrated, personalized enterprise portal for partners, station managers, and support staff. Utilizing Oracle WebCenter Portal, Oracle WebCenter Content, Oracle Identity Management, Oracle SOA Suite, JD Edwards applications, and Oracle CRM On Demand, Statoil is now able to offer a completely redesigned portal for an easy and user-friendly web experience, delivering a fast, secure, robust, and scalable solution that will help the company remain competitive in its industry. The solution has increased Statoil Fuel & Retail's web footprint and expanded its online business. Read the complete article for the full story of Statoil Fuel & Retail's implementation of Oracle Fusion Middleware technology.

    Read the article

  • You Can Deliver an Engaging Online Experience Across All Phases of the Customer Journey

    - by Christie Flanagan
    Engage. Empower. Optimize. Today's customers have higher expectations and more choices than ever before. To succeed in this environment, organizations must deliver an engaging online experience that is personalized, interactive and consistent across all phases of the customer journey. This requires a new approach that connects and optimizes all customer touch points as customers research, select and transact with your brand. Oracle WebCenter Sites combines with other customer experience applications such as Oracle ATG Commerce, Oracle Endeca, Oracle Real-Time Decisions and Siebel CRM to deliver a connected customer experience across your websites and campaigns. Attend this webcast to learn how Oracle WebCenter:
    - Works with Oracle ATG Commerce and Oracle Endeca to deliver consistent and engaging browsing, shopping and search experiences across all of your customer-facing websites
    - Enables you to optimize the performance of your online initiatives through integration with Oracle Real-Time Decisions for automated targeting and segmentation
    - Connects with Siebel CRM to maintain a single view of the customer and integrate campaigns across channels
    Register now for the webcast.

    Read the article

  • Sites with overlapping code bases: developing multiple sites with minimal changes

    - by Web Developer
    I have to develop 3 different sites: video.com for hosting video, audio.com for hosting audio, and docs.com for hosting docs (domain names are examples only). Almost 80% of the functionality is the same for all three, with the remaining 20% being completely different features. How do I handle this? How do sites like SO handle it? I am developing this with the Yii framework and was thinking of implementing the differing features as modules, but in that case the menus and links in the HTML can become difficult to manage.
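    As a sketch of the module idea under discussion, one plausible Yii-style layout (directory names are hypothetical) keeps the shared 80% in one place and isolates each vertical:

        # shared core plus one module per site
        mkdir -p protected/components \
                 protected/modules/{video,audio,docs} \
                 protected/config
        # protected/config/video.php, audio.php and docs.php would each enable
        # exactly one module and define that site's menu, so no links are hard-coded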

    Read the article

  • 1000 most visited sites on the web: A Google Analysis

    Google has released an analysis of the 1000 most visited sites on the web. Considering that we own/operate 3 of the top 10 sites and have a significant interest in Facebook - plus this recent report stating that Microsoft employees are the most social-media-savvy - we will go to great lengths to show how well we can operate in our cloud and social media integration and collaboration strategies. William Tay 2000-2010 | Swinging Technologist http://www.softwaremaker.net/blog

    Read the article

  • Q&A: Drive Online Engagement with Intuitive Portals and Websites

    - by kellsey.ruppel
    We had a great webcast yesterday and wanted to recap the questions that were asked throughout.

    Q: Can ECM distribute content to 3rd party sites?
    A: ECM, which is now called WebCenter Content, can distribute content to 3rd party sites via several means, as well as via SSXA - Site Studio for External Applications.

    Q: Can you provide more information on these means and SSXA?
    A: If you have an existing JSP application, you can add the SSXA libraries to the IDE where your application was built (JDeveloper, for example). You can then drop some code into your 3rd party site/application that can both create and pull dynamically contributable content out of the Content Server for inclusion in your pages. If the 3rd party site is not a JSP application, there is also the option of leveraging two Site Studio (not SSXA) specific custom WebCenter Content services to pull Site Studio XML content into a page. More information on SSXA can be found here: http://docs.oracle.com/cd/E17904_01/doc.1111/e13650/toc.htm

    Q: Is there another way than a "gadget" to integrate applications (like a loan simulator) in WebCenter Sites?
    A: There are some other ways, such as leveraging the Pagelet Producer, which is a core component of WebCenter Portal. Oracle WebCenter Portal's Pagelet Producer (previously known as Oracle WebCenter Ensemble) provides a collection of useful tools and features that facilitate dynamic pagelet development. A pagelet is a reusable user interface component. Any HTML fragment can be a pagelet, but pagelet developers can also write pagelets that are parameterized and configurable, that dynamically interact with other pagelets, and that respond to user input. Pagelets are similar to portlets, but while portlets were designed specifically for portals, pagelets can be run on any web page, including within a portal or other web application. Pagelets can be used to expose platform-specific portlets in other web environments. More on the Pagelet Producer can be found here: http://docs.oracle.com/cd/E23943_01/webcenter.1111/e10148/jpsdg_pagelet.htm#CHDIAEHG

    Q: Can you describe the mechanism available to achieve the context transfer of content?
    A: The primary goal of context transfer is to provide a uniform experience to customers as they transition from one channel to another - for instance, in the use case discussed in the webcast, a customer moving from the .com marketing website to the self-service site where the customer wants to manage his account information. If WebCenter Sites is able to identify and segment the customer into a specific category where the customer is a potential target for some promotions, the same promotions should be targeted to the customer when he is in the self-service site, which is managed by WebCenter Portal. The context transfer can be achieved by calling the WebCenter Sites Engage Server APIs, which will identify the segment that the customer has been bucketed into. Then, again through REST APIs, WebCenter Portal can request from WebCenter Sites the specific content that needs to be targeted at a customer for the identified segment. While this integration can be achieved through custom integration today, Oracle is looking into productizing it in future releases.

    Q: How can context be transferred from WebCenter Sites (marketing site) to WebCenter Portal (online services)?
    A: The WebCenter Portal Personalization server can call into the WebCenter Sites Engage Server to identify the segment for the user, and then through REST APIs request the specific content that needs to be surfaced in the Portal.

    Still have questions? Leave them in the comments section! And you can catch a replay of the webcast here.

    Read the article

  • Can't connect to certain HTTPS sites

    - by mind.blank
    I've just moved to a new apartment with an internet connection via a router, and I'm finding that I can't connect to quite a few sites that use SSL. For example, trying to connect to PayPal:

    curl -v https://paypal.com
    * About to connect() to paypal.com port 443 (#0)
    * Trying 66.211.169.3... connected
    * successfully set certificate verify locations:
    * CAfile: none
      CApath: /etc/ssl/certs
    * SSLv3, TLS handshake, Client hello (1):
    * Unknown SSL protocol error in connection to paypal.com:443
    * Closing connection #0
    curl: (35) Unknown SSL protocol error in connection to paypal.com:443

    curl -v -ssl https://paypal.com gives the same output. For some sites it works:

    curl -v https://www.google.com
    * About to connect() to www.google.com port 443 (#0)
    * Trying 74.125.235.112... connected
    * successfully set certificate verify locations:
    * CAfile: none
      CApath: /etc/ssl/certs
    * SSLv3, TLS handshake, Client hello (1):
    * SSLv3, TLS handshake, Server hello (2):
    * SSLv3, TLS handshake, CERT (11):
    * SSLv3, TLS handshake, Server key exchange (12):
    * SSLv3, TLS handshake, Server finished (14):
    * SSLv3, TLS handshake, Client key exchange (16):
    * SSLv3, TLS change cipher, Client hello (1):
    * SSLv3, TLS handshake, Finished (20):
    * SSLv3, TLS change cipher, Client hello (1):
    * SSLv3, TLS handshake, Finished (20):
    * SSL connection using ECDHE-RSA-RC4-SHA
    * Server certificate:
    * subject: C=US; ST=California; L=Mountain View; O=Google Inc; CN=www.google.com
    * start date: 2011-10-26 00:00:00 GMT
    * expire date: 2013-09-30 23:59:59 GMT
    * common name: www.google.com (matched)
    * issuer: C=ZA; O=Thawte Consulting (Pty) Ltd.; CN=Thawte SGC CA
    * SSL certificate verify ok.
    > GET / HTTP/1.1
    > User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3
    > Host: www.google.com
    > Accept: */*
    >
    < HTTP/1.1 302 Found
    < Location: https://www.google.co.jp/
    . . .

    I'm using Ubuntu 12.04, with Windows 7 installed as well.
    These sites work on Windows :( Not sure if this information helps, but I ran ifconfig and got the following:

    eth0  Link encap:Ethernet  HWaddr 1c:c1:de:bc:e2:4f
          inet6 addr: 2408:c3:7fff:991:686b:8d18:81b3:8dd1/64 Scope:Global
          inet6 addr: 2408:c3:7fff:991:1ec1:deff:febc:e24f/64 Scope:Global
          inet6 addr: fe80::1ec1:deff:febc:e24f/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:87075 errors:0 dropped:0 overruns:0 frame:0
          TX packets:54522 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:78167937 (78.1 MB)  TX bytes:10016891 (10.0 MB)
          Interrupt:46 Base address:0x4000

    eth1  Link encap:Ethernet  HWaddr ac:81:12:0d:93:80
          inet6 addr: fe80::ae81:12ff:fe0d:9380/64 Scope:Link
          UP BROADCAST MULTICAST  MTU:1500  Metric:1
          RX packets:0 errors:0 dropped:0 overruns:0 frame:498
          TX packets:0 errors:26 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
          Interrupt:17

    lo    Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          inet6 addr: ::1/128 Scope:Host
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:630 errors:0 dropped:0 overruns:0 frame:0
          TX packets:630 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0
          RX bytes:39592 (39.5 KB)  TX bytes:39592 (39.5 KB)

    ppp0  Link encap:Point-to-Point Protocol
          inet addr:180.57.228.200  P-t-P:118.23.8.175  Mask:255.255.255.255
          UP POINTOPOINT RUNNING NOARP MULTICAST  MTU:1492  Metric:1
          RX packets:39631 errors:0 dropped:0 overruns:0 frame:0
          TX packets:22391 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:3
          RX bytes:43462054 (43.4 MB)  TX bytes:2834628 (2.8 MB)
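    An aside not from the question itself: one classic culprit on PPPoE links (note ppp0's MTU of 1492 above) is a path-MTU problem that stalls the TLS handshake with exactly this kind of "Unknown SSL protocol error". A sketch of how one might rule that out:

        # probe the largest unfragmented payload (1464 + 28 bytes of headers = 1492)
        ping -c 3 -M do -s 1464 paypal.com
        # temporarily clamp the MTU and retry the handshake
        sudo ip link set dev ppp0 mtu 1400
        curl -v https://paypal.com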

    Read the article
