Search Results

Search found 14041 results on 562 pages for 'home theater'.


  • Python function correctly/incorrectly?

    - by Anthony Kernan
    I'm just starting to use Python as a learning experience; I know the basic logic of programming. I have a function that runs every time, even when it's not supposed to. I use an if statement at the beginning of the function, but I don't know why the if statement isn't working, and I'm confused because another, similar function works correctly. Am I missing something simple? Here's the function that is not working:

        def check_artist_art():
            if os.path.exists("/tmp/artistinfo") and open("/tmp/artistinfo").read() != title:
                #if artist == "":
                if os.path.exists(home + "/.artist"):
                    os.remove(home + "/.artist")
                if os.path.exists("/tmp/artistinfo"):
                    os.remove("/tmp/artistinfo")
                print artist
                return False
            else:
                os.path.exists("/tmp/artistinfo") and open("/tmp/artistinfo").read() == artist
                return False
            return True

    And this is the similar function that is working correctly:

        def check_album():
            if os.path.exists("/tmp/albuminfo") and open("/tmp/albuminfo").read() != album:
                if os.path.exists(home + "/.album"):
                    os.remove(home + "/.album")
                if os.path.exists("/tmp/albuminfo"):
                    os.remove("/tmp/albuminfo")
                return False
            elif os.path.exists("/tmp/trackinfo") and open("/tmp/trackinfo").read() == artist + album:
                return False
            return True

    Any help is greatly appreciated.
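    A minimal sketch of how the first function might be restructured to mirror the working one, assuming the stray comparison in the else branch was meant to be an elif condition and that artist, title and home are the same module-level variables used in the question:

        import os

        def check_artist_art():
            if os.path.exists("/tmp/artistinfo") and open("/tmp/artistinfo").read() != title:
                # Cached info is stale: remove it and report a mismatch.
                if os.path.exists(home + "/.artist"):
                    os.remove(home + "/.artist")
                if os.path.exists("/tmp/artistinfo"):
                    os.remove("/tmp/artistinfo")
                return False
            elif os.path.exists("/tmp/artistinfo") and open("/tmp/artistinfo").read() == artist:
                # Cached info already matches: nothing to do.
                return False
            return True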

    Read the article

  • Escape whitespace in paths using nautilus script

    - by Tommy Brunn
    I didn't think this would be as tricky as it turned out to be, but here I am. I'm trying to write a Nautilus script in Python to upload one or more images to Imgur just by selecting and right-clicking them. It works well enough with both single images and multiple images, as long as the paths don't contain any whitespace. In fact, you can upload a single image whose path contains whitespace, just not multiple ones. The problem is that NAUTILUS_SCRIPT_SELECTED_FILE_PATHS returns all the selected files and directories as a whitespace-separated string. So, for example, it could look like this:

        print os.environ['NAUTILUS_SCRIPT_SELECTED_FILE_PATHS']
        /home/nevon/Desktop/test image.png
        /home/nevon/Desktop/test.jpg

    What I need is a way, either in bash or Python, to escape the spaces within each path, but not the spaces that delimit different items. Either that, or a way to put quotation marks around each item. The ultimate solution would be if I could do that in bash and then send the items as separate arguments to my Python script, something like:

        python uploader.py /home/nevon/Desktop/test\ image.png /home/nevon/Desktop/test.jpg

    I've tried RTFM'ing, but there doesn't seem to be a lot of good solutions for this. At least not that I've found. Any ideas?
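    A minimal sketch of one way around the quoting problem, assuming the selection variable separates entries with newlines (which is what the example output above suggests) and that the uploader accepts one path per argument; the uploader.py path is only illustrative:

        import os
        import subprocess

        # Each selected path sits on its own line, so splitting on newlines
        # keeps spaces inside individual file names intact.
        paths = [p for p in os.environ.get(
            "NAUTILUS_SCRIPT_SELECTED_FILE_PATHS", "").splitlines() if p]

        # Passing a list to subprocess bypasses the shell, so no escaping
        # or quoting of whitespace is needed at all.
        subprocess.call(["python", "/home/nevon/uploader.py"] + paths)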

    Read the article

  • Rake db:migrate returns "rake aborted! no such file to load -- spec"

    - by Isaac Yerushalmi
    For some reason, out of nowhere, Rails began giving me an error on "rake db:migrate", and I can no longer run migrations. It returns the error "no such file to load -- spec" at /home/ti/rails_apps/appname/Rakefile:10. I've spent two hours searching Google for answers, trying to figure this out, but to no avail. What could be the problem? Here is the trace:

        -jailshell-3.2$ rake db:migrate --trace
        (in /home/ti/rails_apps/teamisrael)
        rake aborted!
        no such file to load -- spec
        /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require'
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:156:in `require'
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:521:in `new_constants_in'
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:156:in `require'
        /home/ti/rails_apps/teamisrael/vendor/plugins/google-geocoder/tasks/rspec.rake:5
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:145:in `load_without_new_constant_marking'
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:145:in `load'
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:521:in `new_constants_in'
        /usr/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/dependencies.rb:145:in `load'
        /usr/lib/ruby/gems/1.8/gems/rails-2.3.5/lib/tasks/rails.rb:7
        /usr/lib/ruby/gems/1.8/gems/rails-2.3.5/lib/tasks/rails.rb:7:in `each'
        /usr/lib/ruby/gems/1.8/gems/rails-2.3.5/lib/tasks/rails.rb:7
        /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `gem_original_require'
        /usr/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:31:in `require'
        /home/ti/rails_apps/teamisrael/Rakefile:10
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:2349:in `load'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:2349:in `raw_load_rakefile'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:1985:in `load_rakefile'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:2036:in `standard_exception_handling'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:1984:in `load_rakefile'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:1969:in `run'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:2036:in `standard_exception_handling'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/lib/rake.rb:1967:in `run'
        /usr/lib/ruby/gems/1.8/gems/rake-0.8.3/bin/rake:31
        /usr/local/bin/rake:19:in `load'
        /usr/local/bin/rake:19
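    The trace shows the failure comes from vendor/plugins/google-geocoder/tasks/rspec.rake trying to require the rspec gem. A minimal sketch of the usual guard for such a plugin task file, assuming the spec task is only needed when rspec is actually installed (the task body here is illustrative):

        # vendor/plugins/google-geocoder/tasks/rspec.rake
        begin
          require 'spec/rake/spectask'

          desc "Run the plugin specs"
          Spec::Rake::SpecTask.new(:spec) do |t|
            t.spec_files = FileList['spec/**/*_spec.rb']
          end
        rescue LoadError
          # rspec is not installed; skip defining the spec task so unrelated
          # tasks such as db:migrate can still load the Rakefile.
        end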

    Read the article

  • routes as explained in RoR tutorial 2nd Ed?

    - by 7stud
    The author, Michael Hartl, says that the rule

        get "static_pages/home"

    maps requests for the URI /static_pages/home to the home action in the StaticPages controller. How? The type of request is given, the URL is given, but where is the mapping to a controller and action? My tests all pass, though. I also tried deleting all the actions in the StaticPagesController, which just looks like this:

        class StaticPagesController < ApplicationController
          def home
          end

          def about
          end

          def help
          end

          def contact
          end
        end

    ...and my tests still pass, which is puzzling. The 2nd edition of the book (online) is really frustrating. Specifically, the section about making changes to the Guardfile is impossible to follow. For instance, if I instruct you to edit this file:

        blah blah blah
        dog dog dog
        beetle beetle beetle
        jump jump jump

    and make these changes:

        blah blah blah
        .
        .
        .
        go go go
        .
        .
        .
        jump jump jump

    ...would you have any idea where the line 'go go go' should be in the code? And the hint for exercise 3.5-1 is flat out wrong. If the author would put up a comment section at the end of every chapter, the Rails community could self-edit the book.
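    A minimal sketch of what the shortcut expands to in config/routes.rb, assuming the Rails 3.2-era router the tutorial uses; the application name is illustrative. The controller and action are inferred from the path string itself:

        # config/routes.rb
        SampleApp::Application.routes.draw do
          # Shorthand used in the tutorial:
          get "static_pages/home"

          # Roughly equivalent explicit form: the "controller/action" string
          # doubles as both the URL path and the routing target.
          match "static_pages/home", :to => "static_pages#home", :via => :get
        end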

    Read the article

  • What's causing this permissions error and how can I work around it?

    - by Scott B
    I'm getting the following warnings:

        Warning: move_uploaded_file(/home/site/public_html/wp-content/themes/mytheme/upgrader.zip) [function.move-uploaded-file]: failed to open stream: Permission denied in /home/site/public_html/wp-content/themes/mytheme/uploader.php on line 79
        Warning: move_uploaded_file() [function.move-uploaded-file]: Unable to move '/tmp/phptempfile' to '/home/site/public_html/wp-content/themes/mytheme/upgrader.zip' in /home/site/public_html/wp-content/themes/mytheme/uploader.php on line 79
        There was a problem. Sorry!

    Code is below for that line:

        // permission settings for newly created folders
        $chmod = 0755;

        // Ensures that the correct file was chosen
        $accepted_types = array('application/zip', 'application/x-zip-compressed', 'multipart/x-zip', 'application/s-compressed');
        foreach($accepted_types as $mime_type) {
            if($mime_type == $type) {
                $okay = true;
                break;
            }
        }
        $okay = strtolower($name[1]) == 'zip' ? true: false;
        if(!$okay) {
            die("This upgrader requires a zip file. Please make sure your file is a valid zip file with a .zip extension");
        }

        //mkdir($target);
        $saved_file_location = $target . $filename;

        //Next line is 79
        if(move_uploaded_file($source, $saved_file_location)) {
            openZip($saved_file_location);
        } else {
            die("There was a problem. Sorry!");
        }
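    The warnings say the web server user cannot write to the theme directory. A minimal diagnostic sketch, reusing the $target, $source, $saved_file_location and openZip() names from the question, that reports the real cause before attempting the move:

        // Verify the destination directory before calling move_uploaded_file(),
        // so a permissions problem is reported explicitly.
        if (!is_dir($target)) {
            die("Upload target $target does not exist.");
        }
        if (!is_writable($target)) {
            // The theme folder is often owned by the FTP/cPanel user; the PHP
            // process (e.g. nobody/apache) also needs write permission here.
            die("Upload target $target is not writable by the web server user.");
        }

        if (move_uploaded_file($source, $saved_file_location)) {
            openZip($saved_file_location);
        } else {
            die("There was a problem. Sorry!");
        }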

    Read the article

  • "Method used like a type" error in a unit test

    - by Josepth Vodary
    I am trying to unit test a simple factory, but the compiler keeps telling me that I am trying to use a method like a type. My unit test:

        using System;
        using System.Text;
        using System.Collections.Generic;
        using System.Linq;
        using Microsoft.VisualStudio.TestTools.UnitTesting;
        using Home;

        namespace HomeTest
        {
            [TestClass]
            public class TestFactory
            {
                [TestMethod]
                public void DoTestFactory()
                {
                    InventoryType.InventorySelect select = new InventoryType.InventorySelect();
                    select.inventoryTypes.Add("cds");
                    Home.Services.Factory.CreateInventory get = new Home.Services.Factory.CreateInventory();
                    get.InventoryImpl();
                    if (select.Validate() == true)
                        Console.WriteLine("Test Passed");
                    else if (select.Validate() == false)
                        Console.WriteLine("Test Returned False");
                    else
                        Console.WriteLine("Test Failed To Run");
                    Console.ReadLine();
                }
            }
        }

    My factory:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;

        namespace Home.Services
        {
            public class Factory
            {
                public InventorySvc CreateInventory()
                {
                    return new InventoryImpl();
                }
            }
        }
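    The error comes from the line that writes new Home.Services.Factory.CreateInventory(): CreateInventory is an instance method on Factory, not a nested type, so it has to be called on a Factory object. A minimal sketch of the test using the factory that way, assuming the InventorySvc interface from the question's Home.Services namespace:

        using Home.Services;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        namespace HomeTest
        {
            [TestClass]
            public class TestFactoryCreate
            {
                [TestMethod]
                public void CreateInventoryReturnsAnImplementation()
                {
                    // Call the factory method on an instance instead of
                    // trying to construct the method name as a type.
                    Factory factory = new Factory();
                    InventorySvc inventory = factory.CreateInventory();

                    Assert.IsNotNull(inventory);
                }
            }
        }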

    Read the article

  • Simple problem with mod_rewrite in the Fat Free Framework

    - by ian
    I am trying to set up and learn the Fat-Free Framework for PHP (http://fatfree.sourceforge.net/). It is fairly simple to set up, and I am running it on my machine using MAMP. I was able to get the 'hello world' example running just fine:

        require_once 'path/to/F3.php';
        F3::route('GET /','home');
        function home() {
            echo 'Hello, world!';
        }
        F3::run();

    But when I try to add in the second part, which has two routes:

        require_once 'F3/F3.php';
        F3::route('GET /','home');
        function home() {
            echo 'Hello, world!';
        }
        F3::route('GET /about','about');
        function about() {
            echo 'About Us.';
        }
        F3::run();

    I get a 404 error if I try the second URL, /about. Not sure why one of the mod_rewrite rules would be working and not the other. Below is my .htaccess file:

        # Enable rewrite engine and route requests to framework
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-l
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule .* index.php [L,QSA]

        # Disable ETags
        Header Unset ETag
        FileETag none

        # Default expires header if none specified (stay in browser cache for 7 days)
        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresDefault A604800
        </IfModule>
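    The / route works even without mod_rewrite, so a 404 on /about usually means the rewrite never reaches index.php. A minimal sketch of the .htaccess for the common MAMP case where the project lives in a subdirectory of the document root; the folder name f3demo is only an example:

        # .htaccess in the project folder itself
        RewriteEngine On
        # Must match the URL path of the folder containing index.php,
        # e.g. http://localhost:8888/f3demo/
        RewriteBase /f3demo/
        RewriteCond %{REQUEST_FILENAME} !-l
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule .* index.php [L,QSA]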

    Read the article

  • Redirect requests only if the file is not found?

    - by ZenBlender
    I'm hoping there is a way to do this with mod_rewrite and Apache, but maybe there is another way to consider too. On my site, I have directories set up for re-skinned versions of the site for clients. If the web root is /home/blah/www, a client directory would be /home/blah/www/clients/abc. When you access the client directory via a web browser, I want it to use any requested files in the client directory if they exist. Otherwise, I want it to use the file in the web root. For example, let's say the client does not need their own index.html. Therefore, some code would determine that there is no index.html in /home/blah/www/clients/abc and will instead use the one in /home/blah/www. Keep in mind that I don't want to redirect the client to the web root at any time, I just want to use the web root's file with that name if the client directory has not specified its own copy. The web browser should still point to /clients/abc whether the file exists there or in the root. Likewise, if there is a request for news.html in the client directory and it DOES exist there, then just serve that file instead of the web root's news.html. The user's experience should be seamless. I need this to work for requests on any filename. If I need to, for example, add a new line to .htaccess for every file I might want to redirect, it rather defeats the purpose as there is too much maintenance needed, and a good chance for errors given the large number of files. In your examples, please indicate whether your code goes in the .htaccess file in the client directory, or the web root. Web root is preferred. Thanks for any suggestions! :)
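    A minimal sketch of the mod_rewrite fallback, placed in the web root's .htaccess as preferred; it assumes every client site lives directly under /clients/ and that missing files should be served from the root without changing the URL in the browser:

        # /home/blah/www/.htaccess
        RewriteEngine On

        # If a request under /clients/<name>/ does not match a real file or
        # directory there, rewrite it internally to the same path in the web
        # root. The browser keeps the /clients/<name>/ URL.
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^clients/[^/]+/(.*)$ /$1 [L]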

    Read the article

  • Jquery Modal Popup opens twice on Single Click with ASP.Net MVC3

    - by user1704379
    I am using a modal popup in my MVC3 application. It works fine, but it opens twice for a single click on the link. The modal popup is triggered from the 'Index' view of my Home controller and loads the view 'PopUp.cshtml'; the related action method 'PopUp' is in the Home controller. Here is the code.

    jQuery code on the layout.cshtml page:

        <script type="text/javascript">
            $.ajaxSetup({ cache: false });
            $(document).ready(function () {
                $(".openPopup").live("click", function (e) {
                    e.preventDefault();
                    $("<div></div><p>")
                        .attr("id", $(this).attr("data-dialog-id"))
                        .appendTo("body")
                        .dialog({
                            autoOpen: true,
                            title: $(this).attr("data-dialog-title"),
                            modal: true,
                            height: 250,
                            width: 900,
                            left: 0,
                            buttons: {
                                "Close": function () { $(this).dialog("close"); }
                            }
                        })
                        .load(this.href);
                });
                $(".close").live("click", function (e) {
                    e.preventDefault();
                    $(this).dialog("close");
                });
            });
        </script>

    Markup in 'PopUp.cshtml':

        @{
            ViewBag.Title = "PopUp";
            Layout = null;
        }
        <h2>PopUp</h2>
        <p>
            Hello this is a Modal Pop-Up
        </p>

    Link that opens the modal popup, in the Index view of the Home controller:

        <p>
            @Html.ActionLink("Click here to open modal popup", "Popup", "Home", null,
                new { @class = "openPopup", data_dialog_id = "popuplDialog", data_dialog_title = "PopUp" })
        </p>

    What am I doing wrong that makes the modal popup open twice? Thanks in advance!
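    One common cause of the double dialog with this pattern is the click handler being registered more than once, for example when the layout script block ends up included again by a view or partial. A minimal sketch of a guard, assuming the pre-1.9 jQuery .live()/.die() API the question already uses:

        $(document).ready(function () {
            // Remove any handler registered by an earlier run of this script
            // before attaching a new one, so a single click opens one dialog.
            $(".openPopup").die("click").live("click", function (e) {
                e.preventDefault();
                // ...build and open the dialog exactly as in the code above...
            });
        });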

    Read the article

  • rc.local on Ubuntu on EC2 will not work

    - by Tampa
    Below are the contents of my rc.local file. When I run sudo /etc/rc.local it works fine. When I boot up an instance, I expect monit to be installed, but it is not. I am at a total loss; I usually use rc.local, but this is rather confusing.

        #!/bin/sh -e
        #
        # rc.local
        #
        # This script is executed at the end of each multiuser runlevel.
        # Make sure that the script will "exit 0" on success or any other
        # value on error.
        #
        # In order to enable or disable this script just change the execution
        # bits.
        #
        # By default this script does nothing.

        apt-get -y install monit
        /etc/init.d/monit stop
        cd /home/ubuntu/workspace/rtbopsConfig/
        git fetch
        git checkout origin/master rtb_ec2_boot/ec2_boot.py
        git checkout origin/master config/
        cp /home/ubuntu/workspace/rtbopsConfig/config/monit/redis/monitrc /etc/monit/
        /usr/bin/python /home/ubuntu/workspace/rtbopsConfig/rtb_ec2_boot/ec2_boot.py >> /home/ubuntu/workspace/ec2_boot.txt 2>&1
        /etc/init.d/monit start
        chkconfig monit on
        exit 0
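    A minimal debugging sketch: before anything else, capture everything rc.local does to a log file, so a failure during boot (for example apt-get running before the network is up, or the script aborting early because of -e) becomes visible afterwards. The log path is arbitrary:

        #!/bin/sh -e
        # Send all output from this script to a log so boot-time runs can be inspected.
        exec >> /var/log/rc.local.log 2>&1
        echo "rc.local started: $(date)"

        # Avoid any interactive prompt hanging the boot sequence.
        export DEBIAN_FRONTEND=noninteractive
        apt-get -y install monit

        echo "rc.local finished: $(date)"
        exit 0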

    Read the article

  • Rails - Dynamic name routes namespace

    - by Kuro
    Hi, using Rails 2.3, I'm trying to configure my routes, but I'm encountering some difficulties. I would like to have something like:

        http://mydomain.com/mycontroller/myaction/myid

    which should respond with controllers in the :front namespace, and

        http://mydomain.com/aname/mycontroller/myaction/myid

    which should respond with controllers in the :custom namespace. I tried something like this, but I'm totally wrong:

        map.namespace :front, :path_prefix => "" do |f|
          f.root :controller => :home, :action => :index
          f.resources :home
          ...
        end

        map.namespace :custom, :path_prefix => "" do |m|
          m.root :controller => :home, :action => :index
          m.resources :home
          ...
          m.match ':sub_url/site/:controller/:action/:id'
          m.match ':sub_url/site/:controller/:action/:id'
          m.match ':sub_url/site/:controller/:action/:id.:format'
          m.match ':sub_url/site/:controller/:action.:format'
        end

    I put the matching instructions in the custom namespace, but I'm not sure that's the right place for them. I don't think I really understand how to customize fields in URL matching, and I can't find documentation for Rails 2.3; most of my research leads to Rails 3 docs on the topic. Can somebody help me?
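    A rough sketch of one direction, under the assumption that Rails 2.3's namespace helper accepts a dynamic :path_prefix segment (worth verifying against the 2.3 routing docs); the leading segment would then arrive as params[:sub_url] in the namespaced controllers:

        # config/routes.rb (Rails 2.3)
        ActionController::Routing::Routes.draw do |map|
          # Plain URLs use the Front:: controllers.
          map.namespace :front, :path_prefix => '' do |front|
            front.root :controller => 'home', :action => 'index'
            front.resources :home
          end

          # URLs prefixed with an account name use the Custom:: controllers;
          # the first segment is captured as params[:sub_url].
          map.namespace :custom, :path_prefix => ':sub_url' do |custom|
            custom.root :controller => 'home', :action => 'index'
            custom.resources :home
            custom.connect ':controller/:action/:id'
            custom.connect ':controller/:action/:id.:format'
          end
        end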

    Read the article

  • Looping through with dates

    - by Luke
    I have created a fixture generator for football/soccer games:

        $totalRounds = $teams - 1;
        $matchesPerRound = $teams / 2;
        $rounds = array();
        $roundDates = array();
        $curTime = time();

        for ($i = 1; $i <= $totalRounds; $i++) {
            $rounds[$i] = array();
            $numDays = $i * 4;
            $roundDates[$i] = strtotime("+".$numDays." days", $curTime);
        }

        foreach ($roundDates as $time) {
            for ($round = 0; $round < $totalRounds; $round++) {
                for ($match = 0; $match < $matchesPerRound; $match++) {
                    $home = ($round + $match) % ($teams - 1);
                    $away = ($teams - 1 - $match + $round) % ($teams - 1);
                    // Last team stays in the same place while the others
                    // rotate around it.
                    if ($match == 0) {
                        $away = $teams - 1;
                    }
                    $rounds[$round][$match] = "$user[$home]~$team[$home]@$user[$away]~$team[$away]~$time";
                }
            }
        }

    In the code, I start at $i = 1 because I thought that if you want the first date to be 4 days from now, $i must be 1, so 1 * 4 = 4; if $i were 0, 0 * 4 equals 0. I assume this is correct thinking? Anyway, the main question is that generating the dates isn't working. When I created a fixture list for 4 users, home and away, I got 12 fixtures: 10 of them had the same date, and the other 2 didn't have a date. Can anyone help me with this? Thanks.
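    A minimal sketch of one way to tie each round to its own date: drop the outer foreach (which rebuilds every round once per date, so only the last date survives) and index both the date lookup and the results by the same round number. It reuses the $teams, $user and $team variables from the question:

        $totalRounds     = $teams - 1;
        $matchesPerRound = $teams / 2;
        $rounds          = array();
        $roundDates      = array();
        $curTime         = time();

        for ($round = 0; $round < $totalRounds; $round++) {
            // Round 0 is played 4 days from now, round 1 after 8 days, and so on.
            $roundDates[$round] = strtotime('+' . (($round + 1) * 4) . ' days', $curTime);
            $rounds[$round] = array();

            for ($match = 0; $match < $matchesPerRound; $match++) {
                $home = ($round + $match) % ($teams - 1);
                $away = ($teams - 1 - $match + $round) % ($teams - 1);
                if ($match == 0) {
                    $away = $teams - 1;   // last team stays fixed while the others rotate
                }
                $time = $roundDates[$round];
                $rounds[$round][$match] = "$user[$home]~$team[$home]@$user[$away]~$team[$away]~$time";
            }
        }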

    Read the article

  • Convert JSON into array dataType

    - by Myhome Stories
    I have the following JSON:

        var json = {"result":[{"address":" Ardenham Court, Oxford Road ,AYLESBURY, BUCKINGHAMSHIRE ,UNITED KINGDOM","picture":"1.jpg","uniqueid":"8b54275a60088547d473d462763b4738","story":"I love my home. I feel safe, I am comfortable and I am loved. A home can't be a home without our parents and our loved ones. But sad to say, some are experiencing that eventhough their loved ones are in their houses, they are not loving each other. There is a big war. You can't call it a home."}]}

    I want to get address, picture and story separately. I tried recent answers on Stack Overflow, but I was not able to achieve it. Below is what I have tried:

        $.each(json.result.address, function (index, value) {
            // Get the items
            var items = this.address; // Here 'this' points to a 'group' in 'groups'
            // Iterate through items.
            $.each(items, function () {
                console.log(this.text); // Here 'this' points to an 'item' in 'items'
            });
        });
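    A minimal sketch of reading the fields: json.result is an array of objects, so iterate over the array itself (or index it directly) and read each property off the entry:

        // With jQuery, one callback per entry in the result array.
        $.each(json.result, function (index, entry) {
            console.log(entry.address);
            console.log(entry.picture);
            console.log(entry.story);
        });

        // Or, for the single entry shown above, index it directly.
        var first = json.result[0];
        console.log(first.address, first.picture, first.story);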

    Read the article

  • Week in Geek: LastPass Rescues Xmarks Edition

    - by Asian Angel
    This week we learned how to breathe new life into an aging Windows Mobile 6.x device, use filters in Photoshop, backup and move VirtualBox machines, use the BitDefender Rescue CD to clean an infected PC, and had fun setting up a pirates theme on our computers. Photo by _nash. Weekly Feature Do you love using the Faenza icon set on your Ubuntu system but feel that there are a few much needed icons missing (or you desire a different version of a particular icon)? Then you may want to take a look at the Faenza Variants icon pack. The icons are available in the following sizes: 16px, 22px, 32px, 48px and scalable sizes. Photo by Asian Angel. Faenza Variants Random Geek Links Another week with extra link goodness to help keep you on top of the news. Photo by Asian Angel. LastPass acquires Xmarks, premium service announced Xmarks announced that it has been acquired by LastPass, a cross-platform password management service. This also means that Xmarks is now in transition from a “free” to a “freemium” business model. WikiLeaks reappears on European Net domains WikiLeaks has re-emerged on a Swiss Internet domain followed by domains in Germany, Finland, and the Netherlands, sidestepping a move that had in effect taken the controversial site off the Internet. Iran: Yes, Stuxnet hurt our nuclear program The Stuxnet worm got some big play from Iranian President Mahmoud Ahmadinejad, who acknowledged that the malware dinged his nuclear program. More Windows Rogues than Just AV – Fake Defragmenter Check Disk Don’t think for a second that rogues are limited to scareware, because as so-called products such as “System Defragmenter”, “Scan Disk” “Check Disk” prove, they’re not. Internet Explorer’s Protected Mode can be bypassed Researchers from Verizon Business have now described a way of bypassing Protected Mode in IE 7 and 8 in order to gain access to user accounts. Can you really see who viewed your Facebook profile? Rogue application spreads virally Once again, a rogue application is spreading virally between Facebook users pretending to offer you a way of seeing who has viewed your profile. More holes in Palm’s WebOS Researchers Orlando Barrera and Daniel Herrera, who both work for security firm SecTheory, have discovered a gaping security hole in Palm’s WebOS smartphone operating system. Next-gen banking Trojans hit APAC With the proliferation of banking Trojans, Web and smartphone users of online banking services have to be on constant alert to avoid falling prey to fraud schemes, warned Etay Maor, project manager for RSA Fraud Action. AVG update cripples 64-bit computers A signature update automatically deployed by the AVG virus scanner Thursday has crippled numerous computers. Article includes link to forums to fix computers affected after a restart. Congress moves to outlaw ‘mystery charges’ for Web shoppers Legislation that makes it illegal for Web merchants and so-called post-transaction marketers to charge credit cards without the card owners’ say-so came closer to becoming law this week. Ballmer Set to “Look Into” Windows Home Server Drive Extender Fiasco Tuesday’s announcement from Microsoft regarding the removal of Drive Extender from Windows Home Server has sent shock waves across the web. Google tweaks search recipe to ding scam artists Google has changed its search algorithm to penalize sites deemed to provide an “extremely poor user experience” following a New York Times story on a merchant who justified abusive behavior towards customers as a search-engine optimization tactic. 
Geek Video of the Week Watch as our two friends debate back and forth about the early adoption of new technology through multiple time periods (Stone Age to the far future). Will our reluctant friend finally succumb to the temptation? Photo by CollegeHumor. Early Adopters Through History Random TinyHacker Links Fix Issues in Windows 7 Using Reliability Monitor Learn how to analyze Windows 7 errors and then fix them using the built-in reliability monitor. Learn About IE Tab Groups Tab groups is a useful feature in IE 8. Here’s a detailed guide to what it is all about. Google’s Book Helps You Learn About Browsers and Web A cool new online book by the Google Chrome team on browsers and the web. TrustPort Internet Security 2011 – Good Security from a Less Known Provider TrustPort is not exactly a well-known provider of security solutions. At least not in the consumer space. This review tests in detail their latest offering. How the World is Using Cell phones An infographic showing the shocking demographics of cell phone use. Super User Questions See the great answers to these questions from Super User. I am unable to access my C drive. It says it is unable to display current owner. List of Windows special directories/shortcuts like ‘%TEMP%’ Is using multiple passes for wiping a disk really necessary? How can I view two files side by side in Notepad++ Is there any tool that automatically puts screenshots to my Dropbox? How-To Geek Weekly Article Recap Look through our hottest articles from this past week at How-To Geek. How to Create a Software RAID Array in Windows 7 9 Alternatives for Windows Home Server’s Drive Extender Why Doesn’t Disk Cleanup Delete Everything from the Temp Folder? Ask the Readers: How Much Do You Customize Your Operating System? How to Upload Really Large Files to SkyDrive, Dropbox, or Email One Year Ago on How-To Geek Enjoy reading through these awesome articles from one year ago. How To Upgrade from Vista to Windows 7 Home Premium Edition How To Fix No Aero Transparency in Windows 7 Troubleshoot Startup Problems with Startup Repair Tool in Windows 7 & Vista Rename the Guest Account in Windows 7 for Enhanced Security Disable Error Reporting in XP, Vista, and Windows 7 The Geek Note That wraps things up here for this week. Regardless of the weather wherever you may be, we hope that you have an opportunity to get outside and have some fun! Remember to keep sending those great tips in to us at [email protected]. Photo by Tony the Misfit. Latest Features How-To Geek ETC The How-To Geek Guide to Learning Photoshop, Part 8: Filters Get the Complete Android Guide eBook for Only 99 Cents [Update: Expired] Improve Digital Photography by Calibrating Your Monitor The How-To Geek Guide to Learning Photoshop, Part 7: Design and Typography How to Choose What to Back Up on Your Linux Home Server How To Harmonize Your Dual-Boot Setup for Windows and Ubuntu Hang in There Scrat! – Ice Age Wallpaper How Do You Know When You’ve Passed Geek and Headed to Nerd? On The Tip – A Lamborghini Theme for Chrome and Iron What if Wile E. Coyote and the Road Runner were Human? [Video] Peaceful Winter Cabin Wallpaper Store Tabs for Later Viewing in Opera with Tab Vault

    Read the article

  • Sync Your Pidgin Profile Across Multiple PCs with Dropbox

    - by Matthew Guay
    Pidgin is definitely our favorite universal chat client, but adding all of your chat accounts to multiple computers can be frustrating. Here's how you can easily transfer your Pidgin settings to other computers and keep them in sync using Dropbox.

    Getting Started

    Make sure you have both Pidgin and Dropbox installed on any computers you want to sync. To sync Pidgin, you need to:

    1. Move your Pidgin profile folder on your first computer to Dropbox.
    2. Create a symbolic link at your old profile location that points to the new folder in Dropbox.
    3. Delete the default Pidgin profile on your other computer, and create a symbolic link at the default Pidgin profile location that points to your Dropbox Pidgin profile.

    This sounds difficult, but it's actually easy if you follow these steps. Here we already had all of our accounts set up in Pidgin in Windows 7, and then synced this profile with an Ubuntu and an XP computer with fresh Pidgin installs. Our instructions for each OS are based on this, but just swap the sync order if your main Pidgin install is in XP or Ubuntu.

    Please Note: Make sure Pidgin isn't running on your computer while you are making the changes!

    Sync Your Pidgin Profile from Windows 7

    Here is Pidgin with our accounts already set up. Our Pidgin profile has a Gtalk, MSN Messenger, and Facebook Chat account, and lots of log files. Let's move this profile to Dropbox to keep it synced. Exit Pidgin, and then enter %appdata% in the address bar in Explorer, or press Win+R and enter %appdata%. Select the .purple folder, which is your Pidgin profiles and settings folder, and press Ctrl+X to cut it. Browse to your Dropbox folder, and press Ctrl+V to paste the .purple folder there.

    Now we need to create the symbolic link. Enter "command" in your Start menu search, right-click on the Command Prompt shortcut, and select "Run as administrator". We can now use the mklink command to create a symbolic link at the old profile location that points to the .purple folder in Dropbox. In Command Prompt, enter the following, substituting your own username (mklink takes the link location first and the target second):

        mklink /D "C:\Users\username\AppData\Roaming\.purple" "C:\Users\username\Documents\My Dropbox\.purple"

    And that's it! You can open Pidgin now to make sure it still works as before, with your files being synced with Dropbox.

    Please Note: These instructions work the same for Windows Vista. Also, if you are syncing settings from another computer to Windows 7, then delete the local .purple folder instead of cutting and pasting it, and create the symbolic link in exactly the same way.

    Add your Pidgin Profile to Ubuntu

    Our Ubuntu computer had a clean install of Pidgin, so we didn't need any of the information in its settings. If you've run Pidgin, even without creating an account, you will need to first remove its settings folder. Open your home folder, click View, and then "Show Hidden Files" to see your settings folders. Select the .purple folder and delete it. Now, to create the symbolic link, open Terminal and enter the following, substituting your username:

        ln -s /home/username/Dropbox/.purple /home/username/

    Open Pidgin, and you will see all of your accounts that were on your other computer. No usernames or passwords needed; everything is set up and ready to go. Even your status is synced; we had our status set to Away in Windows 7, and it automatically came up the same in Ubuntu.

    Please Note: If your primary Pidgin profile is in Ubuntu, then cut your .purple folder and paste it into your Dropbox folder instead, and then create the symbolic link in your home folder with the same command.

    Add your Pidgin Profile to Windows XP

    In XP we also had a clean install of Pidgin. If you've run Pidgin, even without creating an account, you will need to first remove its settings folder. Click Start, then Run, and enter %appdata%. Delete your .purple folder. XP does not include a way to create a symbolic link, so we will use the free Junction tool from Sysinternals. Download Junction (link below) and unzip the folder. Open Command Prompt (click Start, select All Programs, then Accessories, and select Command Prompt), and enter cd followed by the path of the folder where you saved Junction. Now, to create the junction, enter the following in Command Prompt, substituting username with your username:

        junction "C:\Documents and Settings\username\Application Data\.purple" "C:\Documents and Settings\username\My Documents\My Dropbox\.purple"

    Open Pidgin, and you will see all of your settings just as they were on your other computer. Everything's ready to go.

    Please Note: If your primary Pidgin profile is in Windows XP, then cut your .purple folder and paste it into your Dropbox folder instead, and then create the junction in Application Data with the same command.

    Conclusion

    This is a great way to keep all of your chat and IM accounts available from all of your computers. You can easily access logs from chats you had on your desktop from your laptop, or if you add a chat account on your work computer you can use it seamlessly from your home computer that evening. Now Pidgin is the universal chat client that is always ready whenever and wherever you need it!

    Links

    Download Pidgin
    Download and sign up for Dropbox
    Download Junction for XP

    Read the article

  • Tricks and Optimizations for your Sitecore website

    - by amaniar
    When working with Sitecore there are some optimizations/configurations I usually repeat in order to make my app production ready. Following is a small list I have compiled from experience, Sitecore documentation, communicating with Sitecore Engineers etc. This is not supposed to be technically complete and might not be fit for all environments.   Simple configurations that can make a difference: 1) Configure Sitecore Caches. This is the most straight forward and sure way of increasing the performance of your website. Data and item cache sizes (/databases/database/ [id=web] ) should be configured as needed. You may start with a smaller number and tune them as needed. <cacheSizes hint="setting"> <data>300MB</data> <items>300MB</items> <paths>5MB</paths> <standardValues>5MB</standardValues> </cacheSizes> Tune the html, registry etc cache sizes for your website.   <cacheSizes> <sites> <website> <html>300MB</html> <registry>1MB</registry> <viewState>10MB</viewState> <xsl>5MB</xsl> </website> </sites> </cacheSizes> Tune the prefetch cache settings under the App_Config/Prefetch/ folder. Sample /App_Config/Prefetch/Web.Config: <configuration> <cacheSize>300MB</cacheSize> <!--preload items that use this template--> <template desc="mytemplate">{XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX}</template> <!--preload this item--> <item desc="myitem">{XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX }</item> <!--preload children of this item--> <children desc="childitems">{XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX}</children> </configuration> Break your page into sublayouts so you may cache most of them. Read the caching configuration reference: http://sdn.sitecore.net/upload/sitecore6/sc62keywords/cache_configuration_reference_a4.pdf   2) Disable Analytics for the Shell Site <site name="shell" virtualFolder="/sitecore/shell" physicalFolder="/sitecore/shell" rootPath="/sitecore/content" startItem="/home" language="en" database="core" domain="sitecore" loginPage="/sitecore/login" content="master" contentStartItem="/Home" enableWorkflow="true" enableAnalytics="false" xmlControlPage="/sitecore/shell/default.aspx" browserTitle="Sitecore" htmlCacheSize="2MB" registryCacheSize="3MB" viewStateCacheSize="200KB" xslCacheSize="5MB" />   3) Increase the Check Interval for the MemoryMonitorHook so it doesn’t run every 5 seconds (default). <hook type="Sitecore.Diagnostics.MemoryMonitorHook, Sitecore.Kernel"> <param desc="Threshold">800MB</param> <param desc="Check interval">00:05:00</param> <param desc="Minimum time between log entries">00:01:00</param> <ClearCaches>false</ClearCaches> <GarbageCollect>false</GarbageCollect> <AdjustLoadFactor>false</AdjustLoadFactor> </hook>   4) Set Analytics.PeformLookup (Sitecore.Analytics.config) to false if your environment doesn’t have access to the internet or you don’t intend to use reverse DNS lookup. 
<setting name="Analytics.PerformLookup" value="false" />   5) Set the value of the “Media.MediaLinkPrefix” setting to “-/media”: <setting name="Media.MediaLinkPrefix" value="-/media" /> Add the following line to the customHandlers section: <customHandlers> <handler trigger="-/media/" handler="sitecore_media.ashx" /> <handler trigger="~/media/" handler="sitecore_media.ashx" /> <handler trigger="~/api/" handler="sitecore_api.ashx" /> <handler trigger="~/xaml/" handler="sitecore_xaml.ashx" /> <handler trigger="~/icon/" handler="sitecore_icon.ashx" /> <handler trigger="~/feed/" handler="sitecore_feed.ashx" /> </customHandlers> Link: http://squad.jpkeisala.com/2011/10/sitecore-media-library-performance-optimization-checklist/   6) Performance counters should be disabled in production if not being monitored <setting name="Counters.Enabled" value="false" />   7) Disable Item/Memory/Timing threshold warnings. Due to the nature of this component, it brings no value in production. <!--<processor type="Sitecore.Pipelines.HttpRequest.StartMeasurements, Sitecore.Kernel" />--> <!--<processor type="Sitecore.Pipelines.HttpRequest.StopMeasurements, Sitecore.Kernel"> <TimingThreshold desc="Milliseconds">1000</TimingThreshold> <ItemThreshold desc="Item count">1000</ItemThreshold> <MemoryThreshold desc="KB">10000</MemoryThreshold> </processor>—>   8) The ContentEditor.RenderCollapsedSections setting is a hidden setting in the web.config file, which by default is true. Setting it to false will improve client performance for authoring environments. <setting name="ContentEditor.RenderCollapsedSections" value="false" />   9) Add a machineKey section to your Web.Config file when using a web farm. Link: http://msdn.microsoft.com/en-us/library/ff649308.aspx   10) If you get errors in the log files similar to: WARN Could not create an instance of the counter 'XXX.XXX' (category: 'Sitecore.System') Exception: System.UnauthorizedAccessException Message: Access to the registry key 'Global' is denied. Make sure the ApplicationPool user is a member of the system “Performance Monitor Users” group on the server.   11) Disable WebDAV configurations on the CD Server if not being used. More: http://sitecoreblog.alexshyba.com/2011/04/disable-webdav-in-sitecore.html   12) Change Log4Net settings to only log Errors on content delivery environments to avoid unnecessary logging. <root> <priority value="ERROR" /> <appender-ref ref="LogFileAppender" /> </root>   13) Disable Analytics for any content item that doesn’t add value. For example a page that redirects to another page.   14) When using Web User Controls avoid registering them on the page the asp.net way: <%@ Register Src="~/layouts/UserControls/MyControl.ascx" TagName="MyControl" TagPrefix="uc2" %> Use Sublayout web control instead – This way Sitecore caching could be leveraged <sc:Sublayout ID="ID" Path="/layouts/UserControls/MyControl.ascx" Cacheable="true" runat="server" />   15) Avoid querying for all children recursively when all items are direct children. Sitecore.Context.Database.SelectItems("/sitecore/content/Home//*"); //Use: Sitecore.Context.Database.GetItem("/sitecore/content/Home");   16) On IIS — you enable static & dynamic content compression on CM and CD More: http://technet.microsoft.com/en-us/library/cc754668%28WS.10%29.aspx   17) Enable HTTP Keep-alive and content expiration in IIS.   18) Use GUID’s when accessing items and fields instead of names or paths. Its faster and wont break your code when things get moved or renamed. 
Context.Database.GetItem("{324DFD16-BD4F-4853-8FF1-D663F6422DFF}") Context.Item.Fields["{89D38A8F-394E-45B0-826B-1A826CF4046D}"]; //is better than Context.Database.GetItem("/Home/MyItem") Context.Item.Fields["FieldName"]   Hope this helps.

    Read the article

  • Why It Is So Important to Know Your Customer

    - by Christie Flanagan
    Over the years, I endured enough delayed flights, air turbulence and misadventures in airport security clearance to watch my expectations for the air travel experience fall to abysmally low levels. The extent of my loyalty to any one carrier had more to do with the proximity of the airport parking garage to their particular gate than to any effort on the airline’s part to actually earn and retain my business. That all changed one day when I found myself at the airport hoping to catch a return flight home a few hours earlier than expected, using an airline I had flown with for the first time just that week.  When you travel regularly for business, being able to catch a return flight home that’s even an hour or two earlier than originally scheduled is a big deal. It can mean the difference between having a normal evening with your family and having to sneak in like a cat burglar after everyone is fast asleep. And so I found myself on this particular day hoping to catch an earlier flight home. I approached the gate agent and was told that I could go on standby for their next flight out. Then I asked how much it was going to cost to change the flight, knowing full well that I wouldn’t get reimbursed by my company for any change fees. “Oh, there’s no charge to fly on standby,” the gate agent told me. I made a funny look. I couldn’t believe what I was hearing. This airline was going to let my fly on standby, at no additional charge, even though I was a new customer with no status or points. It had been years since I’d seen an airline pass up a short term revenue generating opportunity in favor of a long term loyalty generating one.  At that moment, this particular airline gained my loyal business. Since then, this airline has had the opportunity to learn a lot about me. They know where I live, where I fly from, where I usually fly to, and where I like to sit on the plane. In general, I’ve found their customer service to be quite good whether at the airport, via call center and even through social channels. They email me occasionally, and when they do, they demonstrate that they know me by promoting deals for flights from where I live to places that I’d be interested in visiting. And that’s part of why I’m always so puzzled when I visit their website.Does this company with the great service, customer friendly policies, and clean planes demonstrate that they know me at all when I visit their website? The answer is no. Even when I log in using my loyalty program credentials, it’s pretty obvious that they’re presenting the same old home page and same old offers to every single one of their site visitors. I mean, those promotional offers that they’re featuring so prominently  -- they’re for flights that originate thousands of miles from where I live! There’s no way I’d ever book one of those flights and I’m sure I’m not the only one of their customers to feel that way.My reason for recounting this story is not to pick on the one customer experience flaw I've noticed with this particular airline, in fact, they do so many things right that I’ll continue to fly with them. But I did want to illustrate just how glaringly obvious it is to customers today when a touch point they have with a brand is impersonal, unconnected and out of sync. 
    As someone who's spent a number of years in the web experience management and online marketing space, it particularly peeves me when that out-of-sync touch point is a brand's website, perhaps because I know how important it is to make a customer's online experience relevant and how many powerful tools are available for making a relevant experience a reality. The fact is, delivering a one-size-fits-all online customer experience is no longer acceptable or particularly effective in today's world. Today's savvy customers expect you to know who they are and to understand their preferences, behavior and relationship with your brand. Not only do they expect you to know about them, but they also expect you to demonstrate this knowledge across all of their touch points with your brand in a consistent and compelling fashion, whether it be on your traditional website, your mobile web presence or through various social channels. Delivering the kind of personalized online experiences that customers want can have tremendous business benefits. This is not just about generating feelings of goodwill and higher customer satisfaction ratings either. More relevant and personalized online experiences boost the effectiveness of online marketing initiatives, and the statistics prove this out. Personalized web experiences can help increase online conversion rates by 70%, a huge number.1 And more than three quarters of consumers indicate that they've made additional online purchases based on personalized product recommendations.2 Now if only this airline would get on board with delivering a more personalized online customer experience. I'd certainly be happier and more likely to spring for one of their promotional offers. And by targeting relevant offers on their home page to appropriate segments of their site visitors, I bet they'd be happier and generating additional revenue too.
    ***** If you're interested in hearing more perspectives on the benefits of demonstrating that you know your customers by delivering a more personalized experience, check out this white paper on creating a successful and meaningful customer experience on the web. Also catch the video below on the business value of CX in attracting new customers featuring Oracle's VP of Customer Experience Strategy, Brian Curran.
    1 Search Engine Watch
    2 Marketing Charts

    Read the article

  • Grow Your Business with Security

    - by Darin Pendergraft
    Author: Kevin Moulton Kevin Moulton has been in the security space for more than 25 years, and with Oracle for 7 years. He manages the East EnterpriseSecurity Sales Consulting Team. He is also a Distinguished Toastmaster. Follow Kevin on Twitter at twitter.com/kevin_moulton, where he sometimes tweets about security, but might also tweet about running, beer, food, baseball, football, good books, or whatever else grabs his attention. Kevin will be a regular contributor to this blog so stay tuned for more posts from him. It happened again! There I was, reading something interesting online, and realizing that a friend might find it interesting too. I clicked on the little email link, thinking that I could easily forward this to my friend, but no! Instead, a new screen popped up where I was asked to create an account. I was expected to create a User ID and password, not to mention providing some personally identifiable information, just for the privilege of helping that website spread their word. Of course, I didn’t want to have to remember a new account and password, I didn’t want to provide the requisite information, and I didn’t want to waste my time. I gave up, closed the web page, and moved on to something else. I was left with a bad taste in my mouth, and my friend might never find her way to this interesting website. If you were this content provider, would this be the outcome you were looking for? A few days later, I had a similar experience, but this one went a little differently. I was surfing the web, when I happened upon some little chotcke that I just had to have. I added it to my cart. When I went to buy the item, I was again brought to a page to create account. Groan! But wait! On this page, I also had the option to sign in with my OpenID account, my Facebook account, my Yahoo account, or my Google Account. I have all of those! No new account to create, no new password to remember, and no personally identifiable information to be given to someone else (I’ve already given it all to those other guys, after all). In this case, the vendor was easy to deal with, and I happily completed the transaction. That pleasant experience will bring me back again. This is where security can grow your business. It’s a differentiator. You’ve got to have a presence on the web, and that presence has to take into account all the smart phones everyone’s carrying, and the tablets that took over cyber Monday this year. If you are a company that a customer can deal with securely, and do so easily, then you are a company customers will come back to again and again. I recently had a need to open a new bank account. Every bank has a web presence now, but they are certainly not all the same. I wanted one that I could deal with easily using my laptop, but I also wanted 2-factor authentication in case I had to login from a shared machine, and I wanted an app for my iPad. I found a bank with all three, and that’s who I am doing business with. Let’s say, for example, that I’m in a regular Texas Hold-em game on Friday nights, so I move a couple of hundred bucks from checking to savings on Friday afternoons. I move a similar amount each week and I do it from the same machine. The bank trusts me, and they trust my machine. Most importantly, they trust my behavior. This is adaptive authentication. There should be no reason for my bank to make this transaction difficult for me. Now let's say that I login from a Starbucks in Uzbekistan, and I transfer $2,500. What should my bank do now? 
Should they stop the transaction? Should they call my home number? (My former bank did exactly this once when I was taking money out of an ATM on a business trip, when I had provided my cell phone number as my primary contact. When I asked them why they called my home number rather than my cell, they told me that their “policy” is to call the home number. If I'm on the road, what exactly is the use of trying to reach me at home to verify my transaction?) But, back to Uzbekistan… Should my bank assume that I am happily at home in New Jersey, and someone is trying to hack into my account? Perhaps they think they are protecting me, but I wouldn’t be very happy if I happened to be traveling on business in Central Asia. What if my bank were to automatically analyze my behavior and calculate a risk score? Clearly, this scenario would be outside of my typical behavior, so my risk score would necessitate something more than a simple login and password. Perhaps, in this case, a one-time password to my cell phone would prove that this is not just some hacker half way around the world. But, what if you're not a bank? Do you need this level of security? If you want to be a business that is easy to deal with while also protecting your customers, then of course you do. You want your customers to trust you, but you also want them to enjoy doing business with you. Make it easy for them to do business with you, and they’ll come back, and perhaps even Tweet about it, or Like you, and then their friends will follow. How can Oracle help? Oracle has the technology and expertise to help you to grown your business with security. Oracle Adaptive Access Manager will help you to prevent fraud while making it easier for your customers to do business with you by providing the risk analysis I discussed above, step-up authentication, and much more. Oracle Mobile and Social Access Service will help you to secure mobile access to applications by expanding on your existing back-end identity management infrastructure, and allowing your customers to transact business with you using the social media accounts they already know. You also have device fingerprinting and metrics to help you to grow your business securely. Security is not just a cost anymore. It’s a way to set your business apart. With Oracle’s help, you can be the business that everyone’s tweeting about. Image courtesy of Flickr user shareski

    Read the article

  • Three key things to do now to stabilize your RAC cluster environment

    - by Allen Gao
Customers often ask what they can do, proactively, to keep an Oracle RAC cluster stable once it is in production. Based on the problems seen most often in support, this article summarizes the three actions that give a RAC DBA the biggest return right now.

The recommendations below are taken from the MOS note "Top 11 Things to do NOW to Stabilize your RAC Cluster Environment" (DOC ID 1344678.1). The full note lists eleven items; the three covered here prevent the largest share of cluster incidents, so they are the place to start.

Apply the latest Patch Set Update (PSU)

Oracle regularly releases Patch Set Updates (PSUs) for the database and the clusterware. A PSU is a cumulative, well-tested bundle of the most critical fixes, including fixes for RAC and Clusterware bugs that lead to node evictions, instance crashes and hangs, so staying current on PSUs removes a whole class of known problems before you ever hit them. When patching a RAC environment, keep the following in mind:

The PSU applies to both the Grid Infrastructure (GI) home and the RDBMS home. The GI PSU already contains the corresponding RDBMS PSU, so when the GI home is patched the RDBMS home should be brought to the same PSU level rather than being left behind.

PSUs can be applied in a rolling fashion across the nodes of the cluster, for both the GI and the RDBMS homes, so applying a PSU does not have to mean a full cluster outage.

Test the PSU on a test system and read the PSU readme before applying it to production, and plan the rollout so that the whole cluster ends up on the same patch level.

For more information about PSUs, see the following MOS notes:
NOTE 854428.1   Intro to Patch Set Updates (PSU)
NOTE 1082394.1 11.2.0.X Grid Infrastructure PSU Known Issues
NOTE 756671.1   Oracle Recommended Patches -- Oracle Database
NOTE 161549.1   Oracle Database, Networking and Grid Agent Patches for Microsoft Platforms
NOTE 810394.1   RAC and Oracle Clusterware Best Practices and Starter Kit

On releases before 11gR2, set diagwait to 13

As of 2012, roughly 45% of customers were still running releases older than 11gR2, where diagwait is not set by default. On those releases OPROCD, the process that protects the cluster against a hung node, wakes up every second and allows only a 0.5-second margin, so a node that is merely overloaded rather than hung can be rebooted. Setting diagwait to 13 gives OPROCD a 10-second margin (diagwait minus the CSS reboottime, which defaults to 3 seconds), which means a node is only evicted after roughly 11 seconds (the 1-second interval plus the 10-second margin) of real unresponsiveness, and it also gives the clusterware more time to write diagnostic data before the reboot. Changing the value requires a short clusterware outage, so schedule it in a maintenance window.

From 11g Release 2 (11.2.0.1 and later) this behavior is built in and the setting is kept in the Oracle Cluster Registry (OCR), so diagwait no longer needs to be set. You can check the current value with:

# $CLUSTERWARE_HOME\bin\crsctl get css diagwait

For more information about diagwait, see the following MOS notes:
NOTE 567730.1  Changes in Oracle Clusterware on Linux with the 10.2.0.4 Patchset
NOTE 559365.1  Using Diagwait as a diagnostic to get more information for diagnosing Oracle Clusterware Node evictions
NOTE 810394.1 RAC and Oracle Clusterware Best Practices and Starter Kit

Install OS Watcher Black Box (OSWbb) and Cluster Health Monitor (CHM)

A large share of cluster problems - node evictions, instance evictions and apparent hangs - are rooted in the operating system: CPU starvation, memory exhaustion or network trouble. Without a history of OS statistics it is often impossible to find the root cause after the fact, which means the problem cannot be fixed and tends to come back. OS Watcher Black Box (OSWbb, formerly OS Watcher) and Cluster Health Monitor (CHM) collect exactly this history, and Oracle Support routinely asks for their output when diagnosing reboots and evictions.

OSWbb is a lightweight set of scripts that captures the output of the standard OS tools at a fixed interval, every 30 seconds by default, with very little overhead. Starting with 11.2.0.3, Cluster Health Monitor (CHM) is installed together with Oracle GI on most platforms (HP-UX excepted); CHM collects similar OS data, but running OSWbb as well is still recommended. Install OSWbb on every node of the cluster and make sure it is started automatically at boot (see NOTE 580513.1 "How To Start OS Watcher Black Box Every System Boot" for how to do this).

For more information about OSWbb and CHM, see the following MOS notes:
NOTE 301137.1   OS Watcher Black Box User Guide
NOTE 1328466.1 Cluster Health Monitor (CHM) FAQ
NOTE 810394.1   RAC and Oracle Clusterware Best Practices and Starter Kit

Summary

These three actions - apply the latest PSU, set diagwait to 13 on pre-11gR2 releases, and run OSWbb/CHM on every node - stabilize the majority of RAC and Oracle Clusterware environments with very little effort. For the remaining recommendations, see:
NOTE 1344678.1 Top 11 Things to do NOW to Stabilize your RAC Cluster Environment
Questions can also be posted in the My Oracle Support RAC/Scalability community, where Oracle engineers answer questions about RAC and Oracle Clusterware.
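For reference, both the check and the change of diagwait are done with crsctl. The outline below follows the procedure described in NOTE 559365.1; treat it as a sketch only and verify the exact steps for your platform and version against the note - the value may only be changed while the clusterware is stopped on all nodes, and only needs changing on releases before 11.2.0.1.

# check the current value (any node)
$CLUSTERWARE_HOME/bin/crsctl get css diagwait

# change it on pre-11.2.0.1 releases: stop the clusterware on ALL nodes first,
# set the value once as root, then restart the stack on every node
$CLUSTERWARE_HOME/bin/crsctl stop crs
$CLUSTERWARE_HOME/bin/crsctl set css diagwait 13 -force
$CLUSTERWARE_HOME/bin/crsctl start crs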

    Read the article

  • How to set an ImageButton to an RSS feed

    - by L?c Song
    I have a feed_layout.xml <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" android:layout_width="fill_parent" android:layout_height="fill_parent" android:orientation="vertical" > <LinearLayout android:baselineAligned="false" android:layout_width="fill_parent" android:layout_height="wrap_content" android:layout_marginTop="10dp" android:orientation="horizontal" > <LinearLayout android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:orientation="vertical" > <ImageButton android:layout_width="138dp" android:layout_height="138dp" android:layout_gravity="center" android:onClick="homeImageButton" android:scaleType="fitStart" android:src="@drawable/home" android:tag="1" /> </LinearLayout> <LinearLayout android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:orientation="vertical" > <ImageButton android:layout_width="138dp" android:layout_height="138dp" android:layout_gravity="center" android:scaleType="centerCrop" android:onClick="thegioiImageButton" android:src="@drawable/home" android:tag="2" /> </LinearLayout> </LinearLayout> <LinearLayout android:baselineAligned="false" android:layout_width="fill_parent" android:layout_height="wrap_content" android:layout_marginTop="10dp" android:orientation="horizontal" > <LinearLayout android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:orientation="vertical" > <ImageButton android:layout_width="138dp" android:layout_height="138dp" android:layout_gravity="center" android:scaleType="centerCrop" android:onClick="giaitriImageButton" android:src="@drawable/home" android:tag="3" /> </LinearLayout> <LinearLayout android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:orientation="vertical" > <ImageButton android:layout_width="138dp" android:layout_height="138dp" android:layout_gravity="center" android:scaleType="centerCrop" android:onClick="thethaoImageButton" android:src="@drawable/home" android:tag="4" /> </LinearLayout> </LinearLayout> <LinearLayout android:baselineAligned="false" android:layout_width="fill_parent" android:layout_height="wrap_content" android:layout_marginTop="10dp" android:orientation="horizontal" > <LinearLayout android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:orientation="vertical" > <ImageButton android:layout_width="138dp" android:layout_height="138dp" android:layout_gravity="center" android:scaleType="centerCrop" android:onClick="khoahocImageButton" android:src="@drawable/home" android:tag="5" /> </LinearLayout> <LinearLayout android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:orientation="vertical" > <ImageButton android:layout_width="138dp" android:layout_height="138dp" android:layout_gravity="center" android:scaleType="centerCrop" android:onClick="xeImageButton" android:src="@drawable/home" android:tag="6" /> </LinearLayout> </LinearLayout> </LinearLayout> and feedActivity.java package com.dqh.vnexpressrssreader; import android.R.string; import android.app.Activity; import android.content.Intent; import android.os.Bundle; import android.util.Log; import android.view.View; import android.widget.ImageButton; import android.widget.Toast; public class FeedActivity extends Activity { public String tagImg; private static final String TAG = "FeedActivity"; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); 
setContentView(R.layout.feed_layout); } public void homeImageButton(View v) { ImageButton imageButtonClicked = (ImageButton)v; tagImg = imageButtonClicked.getTag().toString(); setTagImg(tagImg); String tt = getTagImg(); Log.d(TAG, "FeedId: " + tt); Intent intent = new Intent(getApplicationContext(), ItemsActivity.class); startActivityForResult(intent, 0); } public void thegioiImageButton(View v) { ImageButton imageButtonClicked = (ImageButton)v; tagImg = imageButtonClicked.getTag().toString(); //Log.d(TAG, "FeedId: " + imageButtonClicked.getTag()); Log.d(TAG, "FeedId: " + tagImg); Intent intent = new Intent(getApplicationContext(), ItemsActivity.class); startActivityForResult(intent, 0); } } and RssReader.java /** * */ package com.dqh.vnexpressrssreader.reader; import java.util.ArrayList; import java.util.List; import org.json.JSONException; import org.json.JSONObject; import com.dqh.vnexpressrssreader.FeedActivity; import com.dqh.vnexpressrssreader.NewsRssReaderDB; import com.dqh.vnexpressrssreader.util.RSSHandler; import com.dqh.vnexpressrssreader.util.Tintuc; import android.content.Context; import android.text.Html; import android.util.Log; /** * @author rob * */ public class RssReader { private final static String TAG = "RssReader"; private final static String BOLD_OPEN = "<B>"; private final static String BOLD_CLOSE = "</B>"; private final static String BREAK = "<BR>"; private final static String ITALIC_OPEN = "<I>"; private final static String ITALIC_CLOSE = "</I>"; private final static String SMALL_OPEN = "<SMALL>"; private final static String SMALL_CLOSE = "</SMALL>"; /** * This method defines a feed URL and then calles our SAX Handler to read the tintuc list * from the stream * * @return List<JSONObject> - suitable for the List View activity */ public static List<JSONObject> getLatestRssFeed(Context context) { NewsRssReaderDB newsRssReaderDB = new NewsRssReaderDB(context); List<Tintuc> tintucsFromDB = newsRssReaderDB.getLists(); return fillData(tintucsFromDB); } public static void getLatestRssFeed(Context context, String feed) { NewsRssReaderDB newsRssReaderDB = new NewsRssReaderDB(context); feed = "http://vnexpress.net/rss/the-gioi.rss"; //RSS 2 feed = "http://vnexpress.net/rss/the-thao.rss"; //RSS 3 feed = "http://vnexpress.net/rss/home.rss"; RSSHandler rh = new RSSHandler(); List<Tintuc> tintucs = rh.getLatestTintucs(feed); if ((tintucs != null) && (tintucs.size() > 0)) { for (Tintuc tintuc : tintucs) { if ((tintuc.getUrl() != null) && !newsRssReaderDB.checkUrlExist(tintuc.getUrl().toString())) { long tintucId = newsRssReaderDB.insertTintuc(tintuc); if (tintucId > 0) { Log.d(TAG, "saved tintucId: " + tintucId); } else { Log.e(TAG, "saved tintucId fail"); } } else { Log.e(TAG, "tintucs exist!"); } } } } /** * This method takes a list of Tintuc objects and converts them in to the * correct JSON format so the info can be processed by our list view * * @param tintucs - list<Tintuc> * @return List<JSONObject> - suitable for the List View activity */ private static List<JSONObject> fillData(List<Tintuc> tintucs) { List<JSONObject> items = new ArrayList<JSONObject>(); for (Tintuc tintuc : tintucs) { JSONObject current = new JSONObject(); try { buildJsonObject(tintuc, current); } catch (JSONException e) { Log.e("RSS ERROR", "Error creating JSON Object from RSS feed"); } items.add(current); } return items; } /** * This method takes a single Tintuc Object and converts it in to a single JSON object * including some additional HTML formating so they can be displayed nicely * * @param 
tintuc * @param current * @throws JSONException */ private static void buildJsonObject(Tintuc tintuc, JSONObject current) throws JSONException { String title = tintuc.getTieude(); String description = tintuc.getMota(); ///////////////////////// //////// 2 ///////////// String date = tintuc.getPubDate(); String imgLink = tintuc.getImgLink(); StringBuffer sb = new StringBuffer(); sb.append(BOLD_OPEN).append(title).append(BOLD_CLOSE); sb.append(BREAK); sb.append(description); sb.append(BREAK); sb.append(SMALL_OPEN).append(ITALIC_OPEN).append(date).append(ITALIC_CLOSE).append(SMALL_CLOSE); current.put("text", Html.fromHtml(sb.toString())); current.put("imageLink", imgLink); current.put("url", tintuc.getUrl().toString()); current.put("title", tintuc.getTieude()); } } I have 1 array RSS and I want each ImageButton is assigned a Rss??. I have attempt to call to FeedActivity from RSSReader but not be help me !
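One way to approach this (a sketch, not code from the question): keep a single click handler in FeedActivity, map each button's android:tag to a feed URL, and pass the chosen URL to ItemsActivity as an Intent extra. The tag-to-feed mapping, the feedImageButton handler name and the "feed_url" extra name below are assumptions; only the feed URLs and the ItemsActivity class come from the question.

import java.util.HashMap;
import java.util.Map;

import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.view.View;

public class FeedActivity extends Activity {

    // which feed belongs to which android:tag is an assumption - adjust as needed
    private static final Map<String, String> FEEDS = new HashMap<String, String>();
    static {
        FEEDS.put("1", "http://vnexpress.net/rss/home.rss");
        FEEDS.put("2", "http://vnexpress.net/rss/the-gioi.rss");
        FEEDS.put("4", "http://vnexpress.net/rss/the-thao.rss");
        // add the remaining tags and feeds here
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.feed_layout);
    }

    // point every ImageButton's android:onClick at this one handler
    public void feedImageButton(View v) {
        String feedUrl = FEEDS.get(String.valueOf(v.getTag()));
        if (feedUrl == null) {
            return; // unknown tag, nothing to open
        }
        Intent intent = new Intent(this, ItemsActivity.class);
        intent.putExtra("feed_url", feedUrl);
        startActivity(intent);
    }
}

ItemsActivity would then read getIntent().getStringExtra("feed_url") and hand it to RssReader.getLatestRssFeed(context, feed) instead of the hard-coded URLs, and each android:onClick attribute in feed_layout.xml would point at feedImageButton.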

    Read the article

  • Upload File to Windows Azure Blob in Chunks through ASP.NET MVC, JavaScript and HTML5

    - by Shaun
Originally posted on: http://geekswithblogs.net/shaunxu/archive/2013/07/01/upload-file-to-windows-azure-blob-in-chunks-through-asp.net.aspx
Many people are using Windows Azure Blob Storage to store their data in the cloud. Blob storage provides 99.9% availability with an easy-to-use API through the .NET SDK and HTTP REST. For example, we can store JavaScript files, images and documents in blob storage when we are building an ASP.NET web application on a Web Role in Windows Azure. Or we can store our VHD files in blob and mount them as a hard drive in our cloud service. If you are familiar with Windows Azure, you should know that there are two kinds of blob: page blob and block blob. The page blob is optimized for random read and write, which is very useful when you need to store VHD files. The block blob is optimized for sequential/chunk read and write, which has more common usage. Since we can upload a block blob in blocks through BlockBlob.PutBlock, and then commit them as a whole blob by invoking BlockBlob.PutBlockList, it is very powerful for uploading large files, as we can upload blocks in parallel and provide a pause-resume feature. There are many documents, articles and blog posts that describe how to upload a block blob. Most of them focus on the server side, which means once you have received a big file, stream or binaries, how to upload them into blob storage in blocks through the .NET SDK.  But the problem is, how can we upload these large files from the client side, for example, a browser? This question came to me when I was working with a Chinese customer to help them build a network disk product on top of Azure. The end users upload their files from the web portal, and then the files are stored in blob storage from the Web Role. My goal was to find the best way to transfer the file from the client (the end user’s machine) to the server (Web Role) through the browser. In this post I will demonstrate and describe what I did to upload large files in chunks at high speed, and save them as blocks into Windows Azure Blob Storage.   Traditional Upload, Works with Limitation The simplest way to implement this requirement is to create a web page with a form that contains a file input element and a submit button. 1: @using (Html.BeginForm("About", "Index", FormMethod.Post, new { enctype = "multipart/form-data" })) 2: { 3: <input type="file" name="file" /> 4: <input type="submit" value="upload" /> 5: } And then in the backend controller, we retrieve the whole content of this file and upload it into the blob storage through the .NET SDK. We can split the file in blocks, upload them in parallel and commit. The code has been well blogged in the community. 
1: [HttpPost] 2: public ActionResult About(HttpPostedFileBase file) 3: { 4: var container = _client.GetContainerReference("test"); 5: container.CreateIfNotExists(); 6: var blob = container.GetBlockBlobReference(file.FileName); 7: var blockDataList = new Dictionary<string, byte[]>(); 8: using (var stream = file.InputStream) 9: { 10: var blockSizeInKB = 1024; 11: var offset = 0; 12: var index = 0; 13: while (offset < stream.Length) 14: { 15: var readLength = Math.Min(1024 * blockSizeInKB, (int)stream.Length - offset); 16: var blockData = new byte[readLength]; 17: offset += stream.Read(blockData, 0, readLength); 18: blockDataList.Add(Convert.ToBase64String(BitConverter.GetBytes(index)), blockData); 19:  20: index++; 21: } 22: } 23:  24: Parallel.ForEach(blockDataList, (bi) => 25: { 26: blob.PutBlock(bi.Key, new MemoryStream(bi.Value), null); 27: }); 28: blob.PutBlockList(blockDataList.Select(b => b.Key).ToArray()); 29:  30: return RedirectToAction("About"); 31: } This works perfect if we selected an image, a music or a small video to upload. But if I selected a large file, let’s say a 6GB HD-movie, after upload for about few minutes the page will be shown as below and the upload will be terminated. In ASP.NET there is a limitation of request length and the maximized request length is defined in the web.config file. It’s a number which less than about 4GB. So if we want to upload a really big file, we cannot simply implement in this way. Also, in Windows Azure, a cloud service network load balancer will terminate the connection if exceed the timeout period. From my test the timeout looks like 2 - 3 minutes. Hence, when we need to upload a large file we cannot just use the basic HTML elements. Besides the limitation mentioned above, the simple HTML file upload cannot provide rich upload experience such as chunk upload, pause and pause-resume. So we need to find a better way to upload large file from the client to the server.   Upload in Chunks through HTML5 and JavaScript In order to break those limitation mentioned above we will try to upload the large file in chunks. This takes some benefit to us such as - No request size limitation: Since we upload in chunks, we can define the request size for each chunks regardless how big the entire file is. - No timeout problem: The size of chunks are controlled by us, which means we should be able to make sure request for each chunk upload will not exceed the timeout period of both ASP.NET and Windows Azure load balancer. It was a big challenge to upload big file in chunks until we have HTML5. There are some new features and improvements introduced in HTML5 and we will use them to implement our solution.   In HTML5, the File interface had been improved with a new method called “slice”. It can be used to read part of the file by specifying the start byte index and the end byte index. For example if the entire file was 1024 bytes, file.slice(512, 768) will read the part of this file from the 512nd byte to 768th byte, and return a new object of interface called "Blob”, which you can treat as an array of bytes. In fact,  a Blob object represents a file-like object of immutable, raw data. The File interface is based on Blob, inheriting blob functionality and expanding it to support files on the user's system. For more information about the Blob please refer here. File and Blob is very useful to implement the chunk upload. 
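(As a side note on the ASP.NET request-length cap mentioned earlier: those limits live in web.config, in ASP.NET's httpRuntime element and IIS's request filtering section. A minimal sketch with illustrative values is shown below; raising them helps for moderately sized files, but it does not remove the load balancer timeout, which is why the chunked approach that follows is still the better answer.)

<!-- sketch only: the raised limits are illustrative, not values from the original post -->
<configuration>
  <system.web>
    <!-- maxRequestLength is in kilobytes; the ASP.NET default is 4096 (4 MB) -->
    <httpRuntime maxRequestLength="102400" executionTimeout="360" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes; the IIS default is about 30 MB -->
        <requestLimits maxAllowedContentLength="104857600" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>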
We will use File interface to represent the file the user selected from the browser and then use File.slice to read the file in chunks in the size we wanted. For example, if we wanted to upload a 10MB file with 512KB chunks, then we can read it in 512KB blobs by using File.slice in a loop.   Assuming we have a web page as below. User can select a file, an input box to specify the block size in KB and a button to start upload. 1: <div> 2: <input type="file" id="upload_files" name="files[]" /><br /> 3: Block Size: <input type="number" id="block_size" value="512" name="block_size" />KB<br /> 4: <input type="button" id="upload_button_blob" name="upload" value="upload (blob)" /> 5: </div> Then we can have the JavaScript function to upload the file in chunks when user clicked the button. 1: <script type="text/javascript"> 1: 2: $(function () { 3: $("#upload_button_blob").click(function () { 4: }); 5: });</script> Firstly we need to ensure the client browser supports the interfaces we are going to use. Just try to invoke the File, Blob and FormData from the “window” object. If any of them is “undefined” the condition result will be “false” which means your browser doesn’t support these premium feature and it’s time for you to get your browser updated. FormData is another new feature we are going to use in the future. It could generate a temporary form for us. We will use this interface to create a form with chunk and associated metadata when invoked the service through ajax. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: if (window.File && window.Blob && window.FormData) { 4: alert("Your brwoser is awesome, let's rock!"); 5: } 6: else { 7: alert("Oh man plz update to a modern browser before try is cool stuff out."); 8: return; 9: } 10: }); Each browser supports these interfaces by their own implementation and currently the Blob, File and File.slice are supported by Chrome 21, FireFox 13, IE 10, Opera 12 and Safari 5.1 or higher. After that we worked on the files the user selected one by one since in HTML5, user can select multiple files in one file input box. 1: var files = $("#upload_files")[0].files; 2: for (var i = 0; i < files.length; i++) { 3: var file = files[i]; 4: var fileSize = file.size; 5: var fileName = file.name; 6: } Next, we calculated the start index and end index for each chunks based on the size the user specified from the browser. We put them into an array with the file name and the index, which will be used when we upload chunks into Windows Azure Blob Storage as blocks since we need to specify the target blob name and the block index. At the same time we will store the list of all indexes into another variant which will be used to commit blocks into blob in Azure Storage once all chunks had been uploaded successfully. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 
4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10:  11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: var blockSizeInKB = $("#block_size").val(); 14: var blockSize = blockSizeInKB * 1024; 15: var blocks = []; 16: var offset = 0; 17: var index = 0; 18: var list = ""; 19: while (offset < fileSize) { 20: var start = offset; 21: var end = Math.min(offset + blockSize, fileSize); 22:  23: blocks.push({ 24: name: fileName, 25: index: index, 26: start: start, 27: end: end 28: }); 29: list += index + ","; 30:  31: offset = end; 32: index++; 33: } 34: } 35: }); Now we have all chunks’ information ready. The next step should be upload them one by one to the server side, and at the server side when received a chunk it will upload as a block into Blob Storage, and finally commit them with the index list through BlockBlobClient.PutBlockList. But since all these invokes are ajax calling, which means not synchronized call. So we need to introduce a new JavaScript library to help us coordinate the asynchronize operation, which named “async.js”. You can download this JavaScript library here, and you can find the document here. I will not explain this library too much in this post. We will put all procedures we want to execute as a function array, and pass into the proper function defined in async.js to let it help us to control the execution sequence, in series or in parallel. Hence we will define an array and put the function for chunk upload into this array. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4:  5: // start to upload each files in chunks 6: var files = $("#upload_files")[0].files; 7: for (var i = 0; i < files.length; i++) { 8: var file = files[i]; 9: var fileSize = file.size; 10: var fileName = file.name; 11: // calculate the start and end byte index for each blocks(chunks) 12: // with the index, file name and index list for future using 13: ... ... 14:  15: // define the function array and push all chunk upload operation into this array 16: blocks.forEach(function (block) { 17: putBlocks.push(function (callback) { 18: }); 19: }); 20: } 21: }); 22: }); As you can see, I used File.slice method to read each chunks based on the start and end byte index we calculated previously, and constructed a temporary HTML form with the file name, chunk index and chunk data through another new feature in HTML5 named FormData. Then post this form to the backend server through jQuery.ajax. This is the key part of our solution. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 
13: // define the function array and push all chunk upload operation into this array 14: blocks.forEach(function (block) { 15: putBlocks.push(function (callback) { 16: // load blob based on the start and end index for each chunks 17: var blob = file.slice(block.start, block.end); 18: // put the file name, index and blob into a temporary from 19: var fd = new FormData(); 20: fd.append("name", block.name); 21: fd.append("index", block.index); 22: fd.append("file", blob); 23: // post the form to backend service (asp.net mvc controller action) 24: $.ajax({ 25: url: "/Home/UploadInFormData", 26: data: fd, 27: processData: false, 28: contentType: "multipart/form-data", 29: type: "POST", 30: success: function (result) { 31: if (!result.success) { 32: alert(result.error); 33: } 34: callback(null, block.index); 35: } 36: }); 37: }); 38: }); 39: } 40: }); Then we will invoke these functions one by one by using the async.js. And once all functions had been executed successfully I invoked another ajax call to the backend service to commit all these chunks (blocks) as the blob in Windows Azure Storage. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.series(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); That’s all in the client side. The outline of our logic would be - Calculate the start and end byte index for each chunks based on the block size. - Defined the functions of reading the chunk form file and upload the content to the backend service through ajax. - Execute the functions defined in previous step with “async.js”. - Commit the chunks by invoking the backend service in Windows Azure Storage finally.   Save Chunks as Blocks into Blob Storage In above we finished the client size JavaScript code. It uploaded the file in chunks to the backend service which we are going to implement in this step. We will use ASP.NET MVC as our backend service, and it will receive the chunks, upload into Windows Azure Bob Storage in blocks, then finally commit as one blob. As in the client side we uploaded chunks by invoking the ajax call to the URL "/Home/UploadInFormData", I created a new action under the Index controller and it only accepts HTTP POST request. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: } 8: catch (Exception e) 9: { 10: error = e.ToString(); 11: } 12:  13: return new JsonResult() 14: { 15: Data = new 16: { 17: success = string.IsNullOrWhiteSpace(error), 18: error = error 19: } 20: }; 21: } Then I retrieved the file name, index and the chunk content from the Request.Form object, which was passed from our client side. 
And then, used the Windows Azure SDK to create a blob container (in this case we will use the container named “test”.) and create a blob reference with the blob name (same as the file name). Then uploaded the chunk as a block of this blob with the index, since in Blob Storage each block must have an index (ID) associated with so that finally we can put all blocks as one blob by specifying their block ID list. 1: [HttpPost] 2: public JsonResult UploadInFormData() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var index = int.Parse(Request.Form["index"]); 9: var file = Request.Files[0]; 10: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 11:  12: var container = _client.GetContainerReference("test"); 13: container.CreateIfNotExists(); 14: var blob = container.GetBlockBlobReference(name); 15: blob.PutBlock(id, file.InputStream, null); 16: } 17: catch (Exception e) 18: { 19: error = e.ToString(); 20: } 21:  22: return new JsonResult() 23: { 24: Data = new 25: { 26: success = string.IsNullOrWhiteSpace(error), 27: error = error 28: } 29: }; 30: } Next, I created another action to commit the blocks into blob once all chunks had been uploaded. Similarly, I retrieved the blob name from the Request.Form. I also retrieved the chunks ID list, which is the block ID list from the Request.Form in a string format, split them as a list, then invoked the BlockBlob.PutBlockList method. After that our blob will be shown in the container and ready to be download. 1: [HttpPost] 2: public JsonResult Commit() 3: { 4: var error = string.Empty; 5: try 6: { 7: var name = Request.Form["name"]; 8: var list = Request.Form["list"]; 9: var ids = list 10: .Split(',') 11: .Where(id => !string.IsNullOrWhiteSpace(id)) 12: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 13: .ToArray(); 14:  15: var container = _client.GetContainerReference("test"); 16: container.CreateIfNotExists(); 17: var blob = container.GetBlockBlobReference(name); 18: blob.PutBlockList(ids); 19: } 20: catch (Exception e) 21: { 22: error = e.ToString(); 23: } 24:  25: return new JsonResult() 26: { 27: Data = new 28: { 29: success = string.IsNullOrWhiteSpace(error), 30: error = error 31: } 32: }; 33: } Now we finished all code we need. The whole process of uploading would be like this below. Below is the full client side JavaScript code. 
1: <script type="text/javascript" src="~/Scripts/async.js"></script> 2: <script type="text/javascript"> 3: $(function () { 4: $("#upload_button_blob").click(function () { 5: // assert the browser support html5 6: if (window.File && window.Blob && window.FormData) { 7: alert("Your brwoser is awesome, let's rock!"); 8: } 9: else { 10: alert("Oh man plz update to a modern browser before try is cool stuff out."); 11: return; 12: } 13:  14: // start to upload each files in chunks 15: var files = $("#upload_files")[0].files; 16: for (var i = 0; i < files.length; i++) { 17: var file = files[i]; 18: var fileSize = file.size; 19: var fileName = file.name; 20:  21: // calculate the start and end byte index for each blocks(chunks) 22: // with the index, file name and index list for future using 23: var blockSizeInKB = $("#block_size").val(); 24: var blockSize = blockSizeInKB * 1024; 25: var blocks = []; 26: var offset = 0; 27: var index = 0; 28: var list = ""; 29: while (offset < fileSize) { 30: var start = offset; 31: var end = Math.min(offset + blockSize, fileSize); 32:  33: blocks.push({ 34: name: fileName, 35: index: index, 36: start: start, 37: end: end 38: }); 39: list += index + ","; 40:  41: offset = end; 42: index++; 43: } 44:  45: // define the function array and push all chunk upload operation into this array 46: var putBlocks = []; 47: blocks.forEach(function (block) { 48: putBlocks.push(function (callback) { 49: // load blob based on the start and end index for each chunks 50: var blob = file.slice(block.start, block.end); 51: // put the file name, index and blob into a temporary from 52: var fd = new FormData(); 53: fd.append("name", block.name); 54: fd.append("index", block.index); 55: fd.append("file", blob); 56: // post the form to backend service (asp.net mvc controller action) 57: $.ajax({ 58: url: "/Home/UploadInFormData", 59: data: fd, 60: processData: false, 61: contentType: "multipart/form-data", 62: type: "POST", 63: success: function (result) { 64: if (!result.success) { 65: alert(result.error); 66: } 67: callback(null, block.index); 68: } 69: }); 70: }); 71: }); 72:  73: // invoke the functions one by one 74: // then invoke the commit ajax call to put blocks into blob in azure storage 75: async.series(putBlocks, function (error, result) { 76: var data = { 77: name: fileName, 78: list: list 79: }; 80: $.post("/Home/Commit", data, function (result) { 81: if (!result.success) { 82: alert(result.error); 83: } 84: else { 85: alert("done!"); 86: } 87: }); 88: }); 89: } 90: }); 91: }); 92: </script> And below is the full ASP.NET MVC controller code. 
1: public class HomeController : Controller 2: { 3: private CloudStorageAccount _account; 4: private CloudBlobClient _client; 5:  6: public HomeController() 7: : base() 8: { 9: _account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("DataConnectionString")); 10: _client = _account.CreateCloudBlobClient(); 11: } 12:  13: public ActionResult Index() 14: { 15: ViewBag.Message = "Modify this template to jump-start your ASP.NET MVC application."; 16:  17: return View(); 18: } 19:  20: [HttpPost] 21: public JsonResult UploadInFormData() 22: { 23: var error = string.Empty; 24: try 25: { 26: var name = Request.Form["name"]; 27: var index = int.Parse(Request.Form["index"]); 28: var file = Request.Files[0]; 29: var id = Convert.ToBase64String(BitConverter.GetBytes(index)); 30:  31: var container = _client.GetContainerReference("test"); 32: container.CreateIfNotExists(); 33: var blob = container.GetBlockBlobReference(name); 34: blob.PutBlock(id, file.InputStream, null); 35: } 36: catch (Exception e) 37: { 38: error = e.ToString(); 39: } 40:  41: return new JsonResult() 42: { 43: Data = new 44: { 45: success = string.IsNullOrWhiteSpace(error), 46: error = error 47: } 48: }; 49: } 50:  51: [HttpPost] 52: public JsonResult Commit() 53: { 54: var error = string.Empty; 55: try 56: { 57: var name = Request.Form["name"]; 58: var list = Request.Form["list"]; 59: var ids = list 60: .Split(',') 61: .Where(id => !string.IsNullOrWhiteSpace(id)) 62: .Select(id => Convert.ToBase64String(BitConverter.GetBytes(int.Parse(id)))) 63: .ToArray(); 64:  65: var container = _client.GetContainerReference("test"); 66: container.CreateIfNotExists(); 67: var blob = container.GetBlockBlobReference(name); 68: blob.PutBlockList(ids); 69: } 70: catch (Exception e) 71: { 72: error = e.ToString(); 73: } 74:  75: return new JsonResult() 76: { 77: Data = new 78: { 79: success = string.IsNullOrWhiteSpace(error), 80: error = error 81: } 82: }; 83: } 84: } And if we selected a file from the browser we will see our application will upload chunks in the size we specified to the server through ajax call in background, and then commit all chunks in one blob. Then we can find the blob in our Windows Azure Blob Storage.   Optimized by Parallel Upload In previous example we just uploaded our file in chunks. This solved the problem that ASP.NET MVC request content size limitation as well as the Windows Azure load balancer timeout. But it might introduce the performance problem since we uploaded chunks in sequence. In order to improve the upload performance we could modify our client side code a bit to make the upload operation invoked in parallel. The good news is that, “async.js” library provides the parallel execution function. If you remembered the code we invoke the service to upload chunks, it utilized “async.series” which means all functions will be executed in sequence. Now we will change this code to “async.parallel”. This will invoke all functions in parallel. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 
15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallel(putBlocks, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: }); In this way all chunks will be uploaded to the server side at the same time to maximize the bandwidth usage. This should work if the file was not very large and the chunk size was not very small. But for large file this might introduce another problem that too many ajax calls are sent to the server at the same time. So the best solution should be, upload the chunks in parallel with maximum concurrency limitation. The code below specified the concurrency limitation to 4, which means at the most only 4 ajax calls could be invoked at the same time. 1: $("#upload_button_blob").click(function () { 2: // assert the browser support html5 3: ... ... 4: // start to upload each files in chunks 5: var files = $("#upload_files")[0].files; 6: for (var i = 0; i < files.length; i++) { 7: var file = files[i]; 8: var fileSize = file.size; 9: var fileName = file.name; 10: // calculate the start and end byte index for each blocks(chunks) 11: // with the index, file name and index list for future using 12: ... ... 13: // define the function array and push all chunk upload operation into this array 14: ... ... 15: // invoke the functions one by one 16: // then invoke the commit ajax call to put blocks into blob in azure storage 17: async.parallelLimit(putBlocks, 4, function (error, result) { 18: var data = { 19: name: fileName, 20: list: list 21: }; 22: $.post("/Home/Commit", data, function (result) { 23: if (!result.success) { 24: alert(result.error); 25: } 26: else { 27: alert("done!"); 28: } 29: }); 30: }); 31: } 32: });   Summary In this post we discussed how to upload files in chunks to the backend service and then upload them into Windows Azure Blob Storage in blocks. We focused on the frontend side and leverage three new feature introduced in HTML 5 which are - File.slice: Read part of the file by specifying the start and end byte index. - Blob: File-like interface which contains the part of the file content. - FormData: Temporary form element that we can pass the chunk alone with some metadata to the backend service. Then we discussed the performance consideration of chunk uploading. Sequence upload cannot provide maximized upload speed, but the unlimited parallel upload might crash the browser and server if too many chunks. So we finally came up with the solution to upload chunks in parallel with the concurrency limitation. We also demonstrated how to utilize “async.js” JavaScript library to help us control the asynchronize call and the parallel limitation.   Regarding the chunk size and the parallel limitation value there is no “best” value. You need to test vary composition and find out the best one for your particular scenario. It depends on the local bandwidth, client machine cores and the server side (Windows Azure Cloud Service Virtual Machine) cores, memory and bandwidth. Below is one of my performance test result. The client machine was Windows 8 IE 10 with 4 cores. I was using Microsoft Cooperation Network. The web site was hosted on Windows Azure China North data center (in Beijing) with one small web role (1.7GB 1 core CPU, 1.75GB memory with 100Mbps bandwidth). 
The test cases were:
- Chunk size: 512KB, 1MB, 2MB, 4MB.
- Upload mode: sequence, parallel (unlimited), parallel with limit (4 threads, 8 threads).
- Chunk format: base64 string, binaries.
- Target file: 100MB.
- Each case was tested 3 times.
Below is the test result chart. Some thoughts, but not guidance or best practice:
- Parallel gets better performance than sequential upload.
- There is no significant performance difference between parallel with 4 threads and with 8 threads.
- Transferring binaries gives better performance than base64 strings.
- In all cases, a chunk size of 1MB - 2MB gives the best performance.
Hope this helps, Shaun. All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Jolicloud is a Nifty New OS for Your Netbook

    - by Matthew Guay
    Want to breathe new life into your netbook?  Here’s a quick look at Jolicloud, a unique new Linux based OS that lets you use your netbook in a whole new way. Netbooks have been an interesting category of computers.  When they were first released, most netbooks came with a stripped down Linux based operating system designed to let you easily access the internet first and foremost.  Consumers wanted more from their netbooks, so full OSes such as Windows XP and Ubuntu became the standard on netbooks.  Microsoft worked hard to get Windows 7 working great on netbooks, and today most netbooks run Windows 7 great.  But the Linux community hasn’t stood still either, and Jolicloud is proof of that.  Jolicloud is a unique OS designed to bring the best of both webapps and standard programs to your netbook.   Keep reading to see if this is the perfect netbook OS for you. Getting Started Installing Jolicloud on your netbook is easy thanks to a the Jolicloud Express installer for Windows.  Since many netbooks run Windows by default, this makes it easy to install Jolicloud.  Plus, your Windows install is left untouched, so you can still easily access all your Windows files and programs. Download and run the roughly 700Mb installer (link below) just as a normal installer in Windows. This will first extract the needed files. Click Get started to install Jolicloud on your netbook. Enter a username, password, and nickname for your computer.  Please note that the username must be all lowercase, and the nickname should not contain spaces or special characters.   Now you can review the default installation settings.  By default it will take up 39Gb and install on your C:\ drive in English.  If you wish to change this, click Change. We chose to install it on the D: drive on this netbook, as its harddrive was already partitioned into two parts.  Click Save when your settings are all correct, and then click Next in the previous window. Jolicloud will prepare for the installation.  This took about 5 minutes in our test.  Click Next when this is finished. Click Restart now to install and run Jolicloud. When your netbook reboots, it will initialize the Jolicloud setup. It will then automatically finish the installation.  Just sit back and wait; there’s nothing for you to do right now.  The installation took about 20 minutes in our test. Jolicloud will automatically reboot when the setup is finished. Once it’s rebooted, you’re ready to go!  Enter the username, then the password, that you chose earlier when you were installing Jolicloud from Windows. Welcome to your Jolicloud desktop! Hardware Support We installed Jolicloud on a Samsung N150 netbook with an Atom N450 processor, 1Gb Ram, 250Gb harddrive, and WiFi b/g/n with Bluetooth.  Amazingly, once Jolicloud was installed, everything was ready to use.  No drivers to install, no settings to hassle with, it was all installed and set up perfectly.  Power settings worked great, and closing the netbook put it to sleep just like in Windows. WiFi drivers have typically been difficult to find and install on Linux, but Jolicloud had our netbook’s wifi working immediately.  To get online, simply click the Wireless icon on the top right, and select the wireless network you want to connect to. Jolicloud will let you know when it is signed on. Wired Lan networking was also seamless; simply connect your cable and you’re ready to go.  The webcam and touchpad also worked perfectly directly.  
The only thing missing was multitouch; this touchpad has two finger scroll, pinch zoom, and other nice multitouch features in Windows, but in Julicloud it only functioned as a standard touchpad.  It did have tap to click activated by default, as well as right-side scrolling, which is nice. Jolicloud also supported our video card without any extra work.  The native resolution was already selected, and the only problem we had with the screen was that there was no apparent way to change the brightness.  This is not a major problem, but would be nice to have.  The Samsung N150 has Intel GMA3150 integrated graphics, and Jolicloud promises 1080p HD video on it.  It did playback 720p H.264 video flawlessly without installing anything extra, but it stuttered on full 1080p HD (which is the exact same as this netbook’s video playback in Windows 7 – 720p works great, but it stutters on 1080p).  We would be excited to see full HD on this netbook, but 720p is definitely fine for most stuff.   Jolicloud supports a wide range of netbooks, and based on our experience we would expect it to work as good on any supported hardware.  Check out the list of supported netbooks to see if your netbook is supported; if not, it still may work but you may have to install special drivers. Jolicloud’s performance was very similar to Windows 7 on our netbook.  It boots in about 30 seconds, and apps load fairly quickly.  In general, we couldn’t tell much difference in performance between Jolicloud and Windows 7, though this isn’t a problem since Windows 7 runs great on the current generation of netbooks. Using Jolicloud Ready to start putting Jolicloud to use?  Your fresh Jolicloud install you can run several built-in apps, such as Firefox, a calculator, and the chat client Pidgin.  It also has a media player and file viewer installed, so you can play MP3s or MPG videos, or read PDF ebooks without installing anything extra.  It also has Flash player installed so you can watch videos online easily. You can also directly access all of your files from the right side of your home screen.  You can even access your Windows files; in our test, the 116.9 GB Media was C: from Windows.  Select it to browse and open any file you had saved in Windows. You may need to enter your password to access it. Once you’re authenticated it, you’ll see all of your Windows files and folders.  Your User files (Documents, Music, Videos, etc.) will be in the Users folder. And, you can easily add files from removable media such as USB flash drives and memory cards.  Jolicloud recognized a flash drive we tested with no trouble at all. Add new apps But, the best part about Jolicloud is that it makes it very easy to install new apps.  Click the Get Started button on your homescreen. You’ll first need to create an account.  You can then use this same account on another netbook if you wish, and your settings will automatically be synced between the two. You can either signup using your Facebook account, …or you can sign up the traditional way with your email address, name, and password.  If you sign up this way, you will need to confirm your email address before your account will be finished. Now, choose your netbook model from the list, and enter a name for your computer. And that’s it!  You’ll now see the Jolicloud dashboard, which will show you updates and notifications from friends who also use Jolicloud. Click the App directory to find new apps for your netbook.  
Here you will find a variety of webapps, such as Gmail, along with native applications, such as Skype, that you can install on your netbook.  Simply click the Install button on the right to add the app to your netbook. You will be prompted to enter your system password, and then the app will install without any further input.   Once an app is installed, a check mark will appear beside its name.  You can remove it by clicking the Remove button, and it will uninstall seamlessly. Webapps, such as Gmail, actually run in in a Chrome-powered window that lets the webapp run full screen.  This gives the webapps a native feel, but actually they’re just running the same as they would in a standard web browser.   The Jolicloud Interface Most apps run maximized, and there is no way to run them smaller.  This in general works good, since with small screens most apps need to run full-screen anyhow. Smaller apps, such as a calculator or the Pidgin chat client, run in a window just like they do on other operating systems. You can switch to another app that’s running by selecting it’s icon on the top left, or you can go back to the home screen by clicking the home screen.  If you’re finished with an program, simply click the red X button on the top right of the window when you’re running it. Or, you can switch between programs using standard keyboard shortcuts such as Alt-tab. The default page on the home screen is the favorites page, and all of your other programs are orginized in their own sections on the left hand side.  But, if you want to add one of these to your favorites page, simply right-click on it and select Add to Favorites. When you’re done for the day, you can simply close your netbook to put it to sleep.  Or, if you want to shut down, just press the Quit button on the bottom right of the home screen and then select Shut Down. Booting Jolicloud When you install Jolicloud, it will set itself as the default operating system.  Now, when you boot your netbook, it will show you a list of installed operating systems.  You can select either Windows or Jolicloud, but if you don’t make a selection it will boot into Jolicloud after waiting 10 seconds. If you’d perfer to boot into Windows by default, you can easily change this.  First, boot your netbook in to Windows.  Open the start menu, right-click on the Computer button, and select Properties.   Click the “Advanced system settings” link on the left side. Click the Settings button in the Startup and Recovery section. Now, select Windows as the default operating system, and click Ok.  Your netbook will now boot into Windows by default, but will give you 10 seconds to choose to boot into Jolicloud when you start your computer. Or, if you decided you don’t want Jolicloud, you can easily uninstall it from within Windows. Please note that this will also remove any files you may have saved in Jolicloud, so be sure to copy them to your Windows drive before uninstalling. To uninstall Jolicloud from within Windows, open Control Panel, and select Uninstall a Program. Scroll down to select Jolicloud, and click Uninstall/Change. Click Yes to confirm that you want to uninstall Jolicloud. After a few moments, it will let you know that Jolicloud has been uninstalled.  You’re netbook is now back the same as it was before you installed Jolicloud, with only Windows installed. Closing Whether you’re wanting to replace your current OS on your netbook or would simply like to try out a fresh new Linux version on your netbook, Jolicloud is a great option for you.  
We were very impressed by it’s solid hardware support and the ease of installing new apps in Jolicloud.  Rather than simply giving us a standard OS, Jolicloud offers a unique way to use your netbook with native programs and webapps.  And whether you’re an IT pro or are a new computer user, Jolicloud was easy enough to use that anyone can do it.  Give it a try, and let us know what your favorite netbook OS is! Link Download Jolicloud for your netbook Similar Articles Productive Geek Tips How To Change XSplash Themes in Ubuntu 9.10Verify the Integrity of Windows Vista System FilesMonitor Multiple Logs in a Single Shell with MultiTail for LinuxHide Some or All of the GUI Bars in FirefoxAsk the Readers: Do You Use a Laptop, Desktop, or Both? TouchFreeze Alternative in AutoHotkey The Icy Undertow Desktop Windows Home Server – Backup to LAN The Clear & Clean Desktop Use This Bookmarklet to Easily Get Albums Use AutoHotkey to Assign a Hotkey to a Specific Window Latest Software Reviews Tinyhacker Random Tips DVDFab 6 Revo Uninstaller Pro Registry Mechanic 9 for Windows PC Tools Internet Security Suite 2010 Stop In The Name Of Love (Firefox addon) Chitika iPad Labs Gives Live iPad Sale Stats Heaven & Hell Finder Icon Using TrueCrypt to Secure Your Data Quickly Schedule Meetings With NeedtoMeet Share Flickr Photos On Facebook Automatically

    Read the article

  • DualLayout for SharePoint 2010 WCM Quick Start

    - by svdoever
    DualLayout for SharePoint 2010 WCM is a solution to provide you with complete HTML freedom in your SharePoint Server 2010 publishing pages. In this post I provide a quick start guide to get you up and running quickly so you can try it out for yourself. This quick start creates a simple HTML5 site with a page to show-case the basics and the power of DualLayout. We will create the site in its own web application. Normally there are many things you have to do to create a clean start point for your SharePoint 2010 WCM site. All those steps will be provided in later posts. For now we want to give you the minimal set of steps to take to get DualLayout working on your machine. Create an authenticated web application with hostheader cms.html5demo.local on port 80 for the cms side of the site. Click the Create Site Collection link on the Application Created dialog box and create a Site Collection based on the Publishing Portal site template. Before we can click the site link in the Top-Level Site Successfully Created dialog we need to add the new host header cms.html5demo.local to the hosts file. Add the following line to the hosts file: 127.0.0.1        cms.html5demo.local Navigate to the site at http://cms.html5demo.local to see the out-of-the-box example Adventure Works publishing site. Download and add the DualLayout solution package designfactory.duallayout.sps2010.trial.1.2.0.0.wsp to the farm’s solution store: On the Start menu, click All Programs. Click Microsoft SharePoint 2010 Products. Click SharePoint 2010 Management Shell. At the Windows PowerShell command prompt, type the following command:Add-SPSolution -LiteralPath designfactory.duallayout.sps2010.trial.1.2.0.0.wsp In SharePoint 2010 Central Administration deploy the solution to the web application http://cms.html5demo.local. Navigate to the site at http://cms.html5demo.local, and in the Site Settings screen select Site Collection Administration > Site collection features and activate the following feature: Open the site http://cms.html5demo.local in SharePoint Designer 2010. 
Create a view-mode masterpage html5simple.master with the following code: html5simple.master <%@ Master language="C#" %> <%@ Register Tagprefix="SharePointWebControls" Namespace="Microsoft.SharePoint.WebControls" Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %> <%@ Register TagPrefix="sdl" Namespace="DesignFactory.DualLayout" Assembly="DesignFactory.DualLayout, Version=1.2.0.0, Culture=neutral, PublicKeyToken=077f92bbf864a536" %>   <!DOCTYPE html> <html class="no-js">       <head>         <meta charset="utf-8" />         <meta http-equiv="X-UA-Compatible" content="IE=Edge" />         <title><SharePointWebControls:FieldValue FieldName="Title" runat="server"/></title>           <script type="text/javascript">             document.createElement('header');             document.createElement('nav');             document.createElement('article');             document.createElement('hgroup');             document.createElement('aside');             document.createElement('section');             document.createElement('footer');             document.createElement('figure');             document.createElement('time');         </script>           <asp:ContentPlaceHolder id="PlaceHolderAdditionalPageHead" runat="server"/>     </head>          <body>                  <header>             <div class="logo">Logo</div>             <h1>SiteTitle</h1>             <nav>                 <a href="#">SiteMenu 1</a>                 <a href="#">SiteMenu 2</a>                 <a href="#">SiteMenu 3</a>                 <a href="#">SiteMenu 4</a>                 <a href="#">SiteMenu 5</a>                 <sdl:SwitchToWcmModeLinkButton runat="server" Text="…"/>             </nav>             <div class="tagline">Tagline</div>             <form>                 <label>Zoek</label>                 <input type="text" placeholder="Voer een zoekterm in...">                 <button>Zoek</button>                             </form>           </header>                  <div class="content">             <div class="pageContent">                 <asp:ContentPlaceHolder id="PlaceHolderMain" runat="server" />             </div>         </div>              <footer>             <nav>                 <ul>                     <li><a href="#">FooterMenu 1</a></li>                     <li><a href="#">FooterMenu 2</a></li>                     <li><a href="#">FooterMenu 3</a></li>                     <li><a href="#">FooterMenu 4</a></li>                     <li><a href="#">FooterMenu 5</a></li>                 </ul>             </nav>             <small>Copyright &copy; 2011 Macaw</small>         </footer>     </body> </html> Note that if no specific WCM-mode master page is specified (html5simple-wcm.master), the default v4.master master page will be used in WCM-mode. 
    Create a WCM-mode page layout html5simplePage-wcm.aspx with the following code:

    html5simplePage-wcm.aspx

        <%@ Page language="C#"
            Inherits="DesignFactory.DualLayout.WcmModeLayoutPage, DesignFactory.DualLayout, Version=1.2.0.0, Culture=neutral, PublicKeyToken=077f92bbf864a536" %>
        <%@ Register Tagprefix="SharePointWebControls"
                     Namespace="Microsoft.SharePoint.WebControls"
                     Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
        <%@ Register Tagprefix="WebPartPages"
                     Namespace="Microsoft.SharePoint.WebPartPages"
                     Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
        <%@ Register Tagprefix="PublishingWebControls"
                     Namespace="Microsoft.SharePoint.Publishing.WebControls"
                     Assembly="Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
        <%@ Register Tagprefix="PublishingNavigation"
                     Namespace="Microsoft.SharePoint.Publishing.Navigation"
                     Assembly="Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

        <asp:Content ContentPlaceholderID="PlaceHolderPageTitle" runat="server">
            <SharePointWebControls:FieldValue id="PageTitle" FieldName="Title" runat="server"/>
        </asp:Content>

        <asp:Content ContentPlaceholderID="PlaceHolderMain" runat="server">
        </asp:Content>

    Notice the Inherits at line two. Instead of inheriting from Microsoft.SharePoint.Publishing.PublishingLayoutPage we need to inherit from DesignFactory.DualLayout.WcmModeLayoutPage.

    Create a view-mode page layout html5simplePage.aspx with the following code:

    html5simplePage.aspx

        <%@ Page language="C#"
            Inherits="DesignFactory.DualLayout.ViewModeLayoutPage, DesignFactory.DualLayout, Version=1.2.0.0, Culture=neutral, PublicKeyToken=077f92bbf864a536" %>
        <%@ Register Tagprefix="SharePointWebControls"
                     Namespace="Microsoft.SharePoint.WebControls"
                     Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
        <%@ Register Tagprefix="WebPartPages"
                     Namespace="Microsoft.SharePoint.WebPartPages"
                     Assembly="Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
        <%@ Register Tagprefix="PublishingWebControls"
                     Namespace="Microsoft.SharePoint.Publishing.WebControls"
                     Assembly="Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
        <%@ Register Tagprefix="PublishingNavigation"
                     Namespace="Microsoft.SharePoint.Publishing.Navigation"
                     Assembly="Microsoft.SharePoint.Publishing, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>

        <asp:Content ContentPlaceholderID="PlaceHolderAdditionalPageHead" runat="server" />

        <asp:Content ContentPlaceholderID="PlaceHolderMain" runat="server">
            The title of the page is: <SharePointWebControls:FieldValue id="PageTitleInContent" FieldName="Title" runat="server"/>
        </asp:Content>

    Notice the Inherits at line two. Instead of inheriting from Microsoft.SharePoint.Publishing.PublishingLayoutPage we need to inherit from DesignFactory.DualLayout.ViewModeLayoutPage.
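    One practical note before continuing, stated as an assumption about your environment: in a Publishing Portal the Master Page Gallery typically has versioning and content approval enabled, so files created in SharePoint Designer can remain checked out or in a draft state and will then not show up in the settings screens used in the next steps. The PowerShell sketch below shows one way to check in, publish and approve the three files; the file names follow this walkthrough, everything else is a sketch rather than part of the original article.

        # Hedged sketch: make the master page and page layouts created above available
        # by checking them in, publishing and approving them where the gallery requires it.
        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

        $site    = Get-SPSite "http://cms.html5demo.local"
        $gallery = $site.GetCatalog([Microsoft.SharePoint.SPListTemplateType]::MasterPageCatalog)
        $names   = @("html5simple.master", "html5simplePage-wcm.aspx", "html5simplePage.aspx")

        foreach ($item in @($gallery.Items | Where-Object { $names -contains $_.File.Name })) {
            $file = $item.File
            if ($file.CheckOutType -ne [Microsoft.SharePoint.SPFile+SPCheckOutType]::None) {
                $file.CheckIn("Checked in by quick start script", [Microsoft.SharePoint.SPCheckinType]::MajorCheckIn)
            }
            if ($gallery.EnableMinorVersions) { $file.Publish("Published by quick start script") }
            if ($gallery.EnableModeration)    { $file.Approve("Approved by quick start script") }
        }

        $site.Dispose()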
    With the master page and page layouts in place, configure the site:

    1. Set the html5simple.master master page as the Site Master Page.
    2. Set the allowed page layouts to the Html5 Simple Page page layout and set the New Page Default Settings also to Html5 Simple Page, so newly created pages are also of this page layout. Note that the Html5 Simple Page page layout is initially not selectable for New Page Default Settings: save the configuration page first after selecting the allowed page layouts, then open it again and select the default new page.
    3. Under Site Actions select the New Page action. Create a page Home.aspx of the default page layout type Html5 Simple Page.
    4. Set the newly created Home.aspx page as the Welcome Page.
    5. Navigate to the site http://cms.html5demo.local and see the home page in the WCM display and edit mode.
    6. Select Switch to View Mode under Site Actions to see the resulting page in view-mode. Select the three dots (…) at the right side of the menu to switch back to WCM-mode.
    7. Have a look at the source view of the resulting web page and admire the clean HTML. No SharePoint-specific markup or CSS files!

    (These configuration steps can also be scripted; a PowerShell sketch follows at the end of this article.)

    Clean HTML in page

        <!DOCTYPE html>
        <html class="no-js">
            <head>
                <meta charset="utf-8" />
                <meta http-equiv="X-UA-Compatible" content="IE=Edge" />
                <title>Home</title>
                <script type="text/javascript">
                    document.createElement('header');
                    document.createElement('nav');
                    document.createElement('article');
                    document.createElement('hgroup');
                    document.createElement('aside');
                    document.createElement('section');
                    document.createElement('footer');
                    document.createElement('figure');
                    document.createElement('time');
                </script>
            </head>

            <body>

                <header>
                    <div class="logo">Logo</div>
                    <h1>SiteTitle</h1>
                    <nav>
                        <a href="#">SiteMenu 1</a>
                        <a href="#">SiteMenu 2</a>
                        <a href="#">SiteMenu 3</a>
                        <a href="#">SiteMenu 4</a>
                        <a href="#">SiteMenu 5</a>
                        <a href="/Pages/Home.aspx?DualLayout_ShowInWcmMode=true">…</a>
                    </nav>
                    <div class="tagline">Tagline</div>
                    <form>
                        <label>Search</label>
                        <input type="text" placeholder="Enter a search term...">
                        <button>Search</button>
                    </form>
                </header>

                <div class="content">
                    <div class="pageContent">
                        The title of the page is: Home
                    </div>
                </div>

                <footer>
                    <nav>
                        <ul>
                            <li><a href="#">FooterMenu 1</a></li>
                            <li><a href="#">FooterMenu 2</a></li>
                            <li><a href="#">FooterMenu 3</a></li>
                            <li><a href="#">FooterMenu 4</a></li>
                            <li><a href="#">FooterMenu 5</a></li>
                        </ul>
                    </nav>
                    <small>Copyright &copy; 2011 Macaw</small>
                </footer>
            </body>
        </html>
        <!-- Macaw DesignFactory DualLayout for SharePoint 2010 Trial version -->

    Note the link <a href="/Pages/Home.aspx?DualLayout_ShowInWcmMode=true">…</a> in the site menu: this link will only be rendered for authenticated users and is our way to switch back to WCM-mode.

    This concludes our quick start to get DualLayout up and running in a matter of minutes. And what is the result? You get the full SharePoint 2010 WCM publishing page editing experience to manage the content in your pages.
    You don’t have to delve into large SharePoint-specific master pages and page layouts with a lot of knowledge of the dos and don’ts with respect to SharePoint controls, scripts and stylesheets. The end-user gets a clean and light HTML page. Get your fully functional, non-timebombed trial copy of DualLayout and start creating!
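    As promised above, here is a rough PowerShell sketch of the site configuration steps (site master page, allowed page layouts, default page layout and welcome page). It assumes the site collection lives at the root of http://cms.html5demo.local and that Home.aspx has already been created in the Pages library; it is an approximation with the publishing object model, not code from the original article.

        # Hedged sketch of the site configuration steps via the publishing object model.
        Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
        [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Publishing")

        $web    = Get-SPWeb "http://cms.html5demo.local"
        $pubWeb = [Microsoft.SharePoint.Publishing.PublishingWeb]::GetPublishingWeb($web)

        # Site Master Page (view-mode pages use CustomMasterUrl); root site collection assumed.
        $web.CustomMasterUrl = "/_catalogs/masterpage/html5simple.master"
        $web.Update()

        # Allowed page layouts and the default for new pages. The layout must already be
        # published/approved and associated with a page content type to show up here.
        $layout = $pubWeb.GetAvailablePageLayouts() | Where-Object { $_.Name -eq "html5simplePage.aspx" }
        $pubWeb.SetAvailablePageLayouts(@($layout), $false)   # $false = do not reset subsites
        $pubWeb.SetDefaultPageLayout($layout, $false)
        $pubWeb.Update()

        # Welcome page; assumes Home.aspx was created via the New Page action.
        $rootFolder = $web.RootFolder
        $rootFolder.WelcomePage = "Pages/Home.aspx"
        $rootFolder.Update()

        $web.Dispose()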

    Read the article

  • What is the best SOHO NAS currently available?

    - by VinceJS
    What is the "best" Small Office Home Office (SOHO) Network Attached Storage (NAS) device available? Best performance vs. cost, that is! I am looking for one that I can use at home to safely store my pictures and videos. What features should I look for? There are so many NAS reviews on the web, how do you choose the right one?

    Read the article

< Previous Page | 116 117 118 119 120 121 122 123 124 125 126 127  | Next Page >