Search Results

Search found 47251 results on 1891 pages for 'web storage'.

Page 845/1891 | < Previous Page | 841 842 843 844 845 846 847 848 849 850 851 852  | Next Page >

  • Database Insider - October 2012 issue

    - by Javier Puerta
    The October issue of the Database Insider newsletter is now available. (Full newsletter here) NEWS   Newly Launched Oracle Exadata X3 Redefines Extreme Performance At Oracle OpenWorld 2012, Oracle announced the general availability of Oracle Exadata Database Machine X3, a complete package of servers, storage, networking, and software that is massively scalable, secure, and fully redundant—and ideally suited for the varied and unpredictable workloads of cloud computing. Read More WEBCASTS What Are Oracle Users Doing to Improve Availability and Disaster Recovery? The Independent Oracle Users Group (IOUG) surveyed more than 350 data managers and professionals regarding planned and unplanned downtime, database high availability, and disaster recovery solutions. Download the report and watch the Webcast today.

    Read the article

  • How to access the SD card through my Sony Ericsson Xperia Arc Phone?

    - by user16364
    I have a Sony Ericsson Xperia Arc phone and I cannot access its SD card through the USB cable provided. I have set the USB connection mode to MSC, yet when I connect the phone to my computer I cannot see the SD card (or anything, for that matter). Disk Utility, however, does see an SEMC Mass Storage device, but it says that no media was detected. I have verified that the SD card works: I removed it from the camera and plugged it into a card reader and saw all the photos and files stored on it. I have also verified that the phone works, as I have connected it (in MSC mode) to my wife's Windows 7 computer. Can anyone please tell me how I can access the SD card on my phone?

    Read the article

  • Settings object with singleton pattern

    - by axis
    I need to build an object that will have only one instance, because this object is dedicated to storing vital settings for my application and I would like to avoid misuse of the type or a conflict at run-time. The most popular solution for this, according to the internet, is the Singleton pattern. But I would like to know about other ideas or solutions for this; I would also like to know whether other solutions can be much easier to grasp for a user of this hypothetical library. Thanks.
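    For illustration only (this sketch is not part of the original question, and the type and member names AppSettings and StorageRoot are invented): a minimal thread-safe singleton in C# looks roughly like this.

        using System;

        // Hypothetical settings singleton: one lazily created, shared instance.
        public sealed class AppSettings
        {
            // Lazy<T> gives thread-safe construction on first use.
            private static readonly Lazy<AppSettings> instance =
                new Lazy<AppSettings>(() => new AppSettings());

            public static AppSettings Instance => instance.Value;

            // A private constructor prevents callers from creating extra instances.
            private AppSettings() { }

            // Example of a "vital setting"; the value here is a placeholder.
            public string StorageRoot { get; set; } = "/var/data";
        }

        // Usage: var root = AppSettings.Instance.StorageRoot;

    An alternative that many library users find easier to grasp is to keep the settings class ordinary and let the host application create a single instance and pass it where needed (dependency injection); that avoids hidden global state at the cost of a little wiring.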

    Read the article

  • Is the php method md5() secure? Can it be used for passwords? [migrated]

    - by awiebe
    So executing a PHP script causes the form values to be sent to the server, and then they are processed. If you want to store a password in your db, then you want it to be a cryptographic hash (so your client side is secure). Can you generate an MD5 using PHP securely (without submitting the user:password pair in the clear), or is there an alternative standard method of doing this, without having the unencrypted password leave the client's machine? Sorry if this is a stupid question; I'm kind of new at this. I think this can be done somehow using HTTPS, and on that note, if a site's login page does not use HTTPS, does that mean that while the database storage is secure, the transport is not?
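    As a sketch of the usual approach (shown in C# rather than PHP, and only to illustrate the idea): the password is sent over HTTPS and hashed server-side with a per-user salt and a slow key-derivation function before storage; newer PHP versions expose the same idea through password_hash(). The class and method names below are invented for the example.

        using System;
        using System.Security.Cryptography;

        // Hypothetical helper: salted PBKDF2 hash of a password received over HTTPS.
        public static class PasswordHasher
        {
            public static string Hash(string password)
            {
                byte[] salt = new byte[16];
                using (var rng = RandomNumberGenerator.Create())
                {
                    rng.GetBytes(salt);              // random per-user salt
                }
                using (var kdf = new Rfc2898DeriveBytes(password, salt, 100000))
                {
                    byte[] key = kdf.GetBytes(32);   // 256-bit derived key
                    // Store the salt alongside the hash; both are needed to verify later.
                    return Convert.ToBase64String(salt) + ":" + Convert.ToBase64String(key);
                }
            }
        }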

    Read the article

  • Amazon releases the AWS SDK for PHP 2, a version entirely rewritten on PHP 5.3 to optimize access to its Cloud services

    Amazon releases the AWS SDK for PHP 2, a version entirely rewritten on PHP 5.3 to optimize access to its Cloud services. Amazon has just published the new version of the AWS SDK (Amazon Web Services) for PHP. The AWS SDK for PHP lets developers using the language build applications that can take advantage of the services of its Cloud platform, including DynamoDB, Amazon Simple Storage Service (Amazon S3), Amazon Glacier and Amazon CloudFront. The new AWS SDK has been rebuilt entirely from scratch to take full advantage of PHP 5.3 and to follow the recommendations of the PHP Framework Interop Group.

    Read the article

  • DropVox Records Voice Memos Right to Your Dropbox Account

    - by Jason Fitzpatrick
    DropVox is a clever and highly specialized application that, quite effectively, turns your iOS device into a voice recorder with Dropbox-based storage. Install the app, launch it, hit the record button, and your recording is uploaded to your Dropbox account in .m4a format as soon as you're finished creating it. You can also configure DropVox to start recording immediately after launch and to continue recording if the device is locked or other applications are in use. Hit up the link to grab a copy. DropVox is currently $0.99 (50% off for a limited time) and works on the iPhone, iPad, and iPod Touch with microphone attached. DropVox [via Download Squad]

    Read the article

  • New .Net Authentication in 4.5.1

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/11/05/new-.net-authentication-in-4.5.1.aspx
    There has been a lot of traffic on my post about the Simple Membership that came with File > New Project in MVC 4 in 2012. I was reading the release notes for Visual Studio 2013 and .NET 4.5.1 and they mentioned a new/updated authentication approach: "ASP.NET Identity is the new membership system for ASP.NET applications. ASP.NET Identity makes it easy to integrate user-specific profile data with application data. ASP.NET Identity also allows you to choose the persistence model for user profiles in your application. You can store the data in a SQL Server database or another data store, including NoSQL data stores such as Windows Azure Storage Tables." There's a great page on the asp.net site that gives an introduction, overview, how to use it, and how to migrate to it. I won't be doing a new project for a while at work, but I'll definitely be looking into this more when I get the time.
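    As a rough sketch of what the pluggable persistence looks like in code (assuming the Microsoft.AspNet.Identity.Core and Microsoft.AspNet.Identity.EntityFramework packages; the connection string name "DefaultConnection" and the method name are assumptions for the example):

        using System.Threading.Tasks;
        using Microsoft.AspNet.Identity;
        using Microsoft.AspNet.Identity.EntityFramework;

        public class IdentityExample
        {
            // Creates a user through ASP.NET Identity. The EF-backed UserStore could be
            // swapped for another IUserStore implementation (for example one over
            // Windows Azure Storage Tables) to change the persistence model.
            public static async Task<bool> RegisterAsync(string userName, string password)
            {
                using (var context = new IdentityDbContext<IdentityUser>("DefaultConnection"))
                {
                    var manager = new UserManager<IdentityUser>(new UserStore<IdentityUser>(context));
                    IdentityResult result = await manager.CreateAsync(
                        new IdentityUser { UserName = userName }, password);
                    return result.Succeeded;
                }
            }
        }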

    Read the article

  • Microsoft Releases Windows Server 2012

    Windows Server 2012 offers expanded virtualization capabilities and works with Windows Azure, the software company's cloud platform. It can deliver more than 200 public, private and hybrid cloud services. The goal seems to be to deliver any application on any cloud. Rand Morimoto, president of Microsoft partner Convergent Computing, sees a number of major selling points for Windows Server 2012. For instance, its deduplication features can save a company 40 to 50 percent in storage space. Its automated IT management capabilities enable it to manage a large number of virtual machines. It can eve...

    Read the article

  • The Advancement of Technology That Lead to Websites Being One of the Most Important Business Assets

    Twenty years ago the world was a very different place. Most companies were still using paper based filing systems and people saw computers as being complicated and expensive. Businesses had storage rooms and large filing cabinets full of alphabetically and chronologically ordered documents and letters. Due to the efforts of large corporations, technology has advanced in a way that most people would have never imagined. What would have taken up a full warehouse worth of space can now be stored digitally in a device that is smaller than a book and it can be searched through in a matter of seconds.

    Read the article

  • Synchronize Azure SQL (cloud) with Azure SQL Emulator (local)?

    - by Sid
    We have an Azure service (web role) that heavily depends on the database. For offline development/testing, we'd like to have the app+db run offline within the emulators. Running the web role itself within the emulator is straightforward, but doing so for the Azure SQL storage isn't. What is the simplest way to ensure that the cloud Azure SQL database and the emulator/local Azure SQL database are in sync? We can afford some level of staleness for simplicity of the sync operation (meaning it's OK for the local copy to be a few hours stale, versus mirroring every write as soon as it happens). Thanks

    Read the article

  • resizing partitions

    - by venetin
    I have the following configuration:
    sda1: 1 GB, maybe FAT32 (Windows recovery partition)
    sda2: 40 GB, NTFS (Windows drive C) with boot flag
    sda3: around 100 GB, NTFS (storage partition)
    sda4: extended partition, containing sda5: 10 GB ext4 partition, and sda6: 1 GB Linux swap
    I want to make these changes:
    sda2: 30 GB (decrease size by 10 GB)
    sda3: around 100 GB (move and maybe decrease size by 4-5 GB)
    sda4: around 20-22 GB (move and increase size by 10-15 GB)
    sda5: around 20 GB (move and increase size by 10-12 GB)
    sda6: 2 GB (move and increase size by 1 GB)
    Is it safe to do these operations? Will I lose GRUB? I will do the changes with GParted on a Puppy Linux live USB. Thanks

    Read the article

  • Entity System and rendering

    - by hayer
    Okay, what I know so far: the entity contains a component (data storage) which holds information like a texture/sprite, a shader, etc., and then I have a renderer system which draws all this. But what I don't understand is how the renderer should be designed. Should I have one component for each "visual type": one component without a shader, one with a shader, etc.? I just need some input on what's the "correct way" to do this, plus tips and pitfalls to watch out for.
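    One hedged way to picture it (all names below are invented for the sketch): keep a single render component whose optional fields cover the variations, and let the render system branch on the data rather than defining one component type per visual style.

        using System;
        using System.Collections.Generic;

        // Hypothetical sketch: one RenderComponent for every drawable entity; the
        // shader is optional data instead of a separate component type.
        public class RenderComponent
        {
            public string TexturePath;   // sprite/texture to draw
            public string ShaderName;    // null or empty means "use the default shader"
        }

        public class RenderSystem
        {
            // Iterate every render component and resolve the optional pieces at draw
            // time, so no per-"visual type" component classes are needed.
            public void Draw(IEnumerable<RenderComponent> components)
            {
                foreach (var c in components)
                {
                    string shader = string.IsNullOrEmpty(c.ShaderName) ? "default" : c.ShaderName;
                    Console.WriteLine($"draw {c.TexturePath} with shader {shader}");
                }
            }
        }

    Splitting into one component per visual type tends to multiply component classes quickly; keeping the variation as data in one component is usually easier to extend.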

    Read the article

  • Does home directory encryption depend on gnome keyring?

    - by pedorro
    My gnome-keyring has somehow gotten messed up. It prompts for a password (that I know I never provided - yes I chose 'unsafe storage'). None of the possible passwords that I use (including empty) are working. So basically I want to delete the default key so I can start over. I just want to confirm that this isn't somehow tied to my home directory encryption. I want to be sure that if I delete the default key from it, I will still be able to log in normally and decrypt my home directory. It seems likely that they're unrelated as the keyring is within the home directory and is thus itself encrypted, but I just thought I'd ask. Anyone have any thoughts?

    Read the article

  • I need to get past my permissions to recover data

    - by adsmz
    Due to some mishaps, I am unable to boot into Kubuntu at all. However, my data is still on the hard drive. I managed to get one of the other two computers to which I have access to read the disk by booting into a live CD session of Kubuntu. The only storage medium to which I have access is a 30 GB data stick. Here's where the trouble starts: in music alone, I have to back up about 60 GB. Obviously this is going to have to be split into chunks and moved over to the second spare PC until I can reinstall Kubuntu on my laptop. All of the data that needs to be backed up is behind a permissions wall, so while I can view it, I can't interact with it directly. I know copying and moving through the terminal can get around this with sudo cp or sudo mv, but is there a way to first compress multiple folders into a single archive, then move it? (While we're on the subject, what compression method would be best for large volumes of music in MP3, WAV, and OGG format?)

    Read the article

  • 'outlier': I/O ???

    - by katsumii
    ??? outlier ???????????????????????????????? - Wikipedia???(????)????????????????????????????????????????????????????????????????
    outlier site:docs.oracle.com - Google Search
    Outlier Update Percent (MRP and Supply Chain Planning Help)
    Oracle Demantra Implementation Guide
    OraSVMClassificationSettings (Oracle Data Mining Java API ...
    Defining a Forecast Set (MRP and Supply Chain Planning Help)
    ????????????????????? I/O ???????????? ????????? 'Exadata' ? 'outlier' ???????????????????????????????
    Guy Harrison - Yet Another Database Blog - Exadata Smart Flash Logging–Outliers: flash log feature was effective in eliminating or at least reducing very extreme outlying redo log sync times.
    ????????? Solaris 11.1 ?? I/O ??????????????????????
    Oracle Announces Availability of Oracle Solaris 11.1 and Oracle Solaris Cluster 4.1: Oracle Solaris 11.1 exposes Oracle Solaris DTrace I/O interfaces that allow an Oracle Database administrator to identify I/O outliers and subsequently isolate network or storage bottlenecks.

    Read the article

  • SLOB: ?????????????

    - by katsumii
    Oracle DB????????????????????????????
    Introducing SLOB – The Silly Little Oracle Benchmark « Kevin Closson's Blog: Platforms, Databases and Storage:
    SLOB supports testing Oracle logical read (SGA buffer gets) scaling
    SLOB supports testing physical random single-block reads (db file sequential read)
    SLOB supports testing random single block writes (DBWR flushing capacity)
    SLOB supports testing extreme REDO logging I/O
    ????????????????Oracle?????????Swingbench ??????????IPC Semaphore?????C???????????????????Windows???????????Cygwin??????????????????????????????
    Swingbench: Swingbench can be used to demonstrate and test technologies such as Real Application Clusters, Online table rebuilds, Standby databases, Online backup and recovery etc.
    ???????I/O?????????????????
    Oracle ORION Downloads: ORION (Oracle I/O Calibration Tool) is a standalone tool for calibrating the I/O performance for storage systems
    SLOB ???????????????????????????

    Read the article

  • SQL*Plus??? - ??????????????(????? ???Tips-2)

    - by Yuichi.Hayashi
    script??????????????????????????SQL*Plus???????????????????SQL*Plus????????????????????????? ????????????????SQL*Plus???????????????????? SQL*Plus?-s????????????????????????????? ???????????????????????????????????

    <-s??????????>
        $ sqlplus scott/tiger
        SQL*Plus: Release 11.2.0.1.0 Production on ? 12? 22 17:14:14 2010
        Copyright (c) 1982, 2009, Oracle. All rights reserved.
        Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
        With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP, Data Mining and Real Application Testing options
        ?????????
        SQL

    <-s???????????>
        $ sqlplus -s scott/tiger
        select sysdate from dual;

        SYSDATE
        --------
        10-12-22

        exit
        $

    (Written by Hiroyuki Nakaie)

    Read the article

  • ????Oracle EBS R12 on Exadata V2 ,MMA and Hight Performance

    - by longchun.zhu
    ???????? ?????,???, ????????hands-on ??? ??????: 1. Oracle EBS R12 on Database Machine MAA & Performance Architecture. 2. Oracle EBS R12 Single Instance Node Deployment Procedure. 2.Start Rapid Install Wizard. 3. Oracle EBS R12 Single Instance Node Chinese Patch Update. 4.Applying Patches. 5.Upgrade Application Database Version to 11g Release 2. 6.Database Upgrade 7. Deploy Clone Application Database to Sun Oracle Database Machine. 8.Migrate Application Database File System to Exadata ASM Storage. 9.Covert Application Database Single Instance to RAC.. 10.Configure High Availability & High Performance Architecture with Exadata.

    Read the article

  • ???????/???Oracle ASM?Oracle Clusterware???????????

    - by user788995
    ????? ??:2012/01/23 ??:??????/?? Oracle Database 11g Release 2 ? Oracle Clusterware ? Oracle Automatic Storage Management(ASM) ?????????????????????????? RAC·ASM·Clusterware ?????????? Oracle Clusterware ? Oracle ASM??? / Oracle Clusterware ???????????Oracle ASM ?????????????????? / Oracle Clusterware ??????????Oracle ASM ??????????????? ????????? ????????????????? http://otndnld.oracle.co.jp/ondemand/otn-seminar/movie/111202_C-14_ASMClusterware.wmv http://otndnld.oracle.co.jp/ondemand/otn-seminar/movie/mp4/111202_C-14_ASMClusterware.mp4 http://www.oracle.com/technetwork/jp/ondemand/db-technique/c-14-asmclusterware-1448430-ja.pdf

    Read the article

  • ???10.2???+ASM???11.2???+ASM(Oracle Restart)

    - by JaneZhang(???)
        ?????????????10.2.0.5?????ASM??????????11.2.0.2???ASM???????(??Oracle Restart).    OS: Linux 5.3    ????:    1.???10.2????????11.2.0.2?OUI,?????“"Upgrade Oracle Grid Infrastructure of Oracle Automatic Storage Management”?????ASM???????,???????????   2. ?11.2.0.2 ????????????ORACLE_HOME?   3. ??DBUA?10.2.0.5???????11.2.0.2.    ?????????:How_to_Upgrade_10.2_Single_ASM_To_11.2_Single_ASM.pdf

    Read the article

  • ??????????? Oracle OpenWorld Tokyo 2012 ?????? ~?????????????~ ????????!

    - by M.Morozumi
    Oracle OpenWorld Tokyo 2012?JavaOne Tokyo 2012 ???????????????????????????Oracle Open World ?????????????????2????????? Exadata ???????·DBA·????????? Exadata?????????(???)???Oracle Exadata Database Machine ??? Exadata Storage Server X2-2 ??????????Exadata ???????????????????????????????????Database Machine?????????????????????????????????????????????????????????? ?????????(???) 2012?4?2?~4?3? Oracle BI Suite EE 10g??????! 11g????????????????????!! Oracle BIEE 10g???? 11g Report/Dashboard ?????Oracle BI Suite EE 10g?????????11g???(????)????????????????????????? Oracle BIEE 10g???? 11g Report/Dashboard ?? 2012?4?3?~4?4?

    Read the article

  • Implementing Release Notes in TFS Team Build 2010

    - by Jakob Ehn
    In TFS Team Build (all versions), each build is associated with changesets and work items. To determine which changesets should be associated with the current build, Team Build finds the label of the "Last Good Build" and then aggregates all changesets up until the label for the current build. Basically this means that if your build is failing, every changeset that is checked in will be accumulated in this list until the build is successful. All well, but there is a dimension missing here, regarding releases. Often you run several release builds before you actually deploy the result of a build to a test or production system. When you do this, wouldn't it be nice to be able to send the customer a release note that contains all work items and changesets since the previously deployed version?

    At our company, we have developed a Release Repository, which basically is a simple web site with a SQL database as storage. Every time we run a Release Build, the resulting installers, zip files, SQL scripts etc. get pushed into the release repository together with the relevant build information. This information contains things such as the start time, who triggered the build, and so on. It also contains the associated changesets and work items. When deploying the MSIs for a new version, we mark the build as Deployed in the release repository. The Deployed status is stored in the release repository database, but it could also have been implemented by setting the Build Quality for that build to Deployed. When generating the release notes, the web site simply runs through each release build back to the previous build that was marked as Deployed, and aggregates the work items and changesets. Here is a sample screenshot of how this looks for a sample build/application. The web site is available both to us and to the customers and testers, which means that they can easily get the latest version of a particular application and at the same time see what changes are included in this version.

    There is a lot going on in the Release Build Process that drives this in our TFS 2010 server, but in this post I will show how you can access and read the changeset and work item information in a custom activity. Since Team Build associates changesets and work items for each build, this information is (partially) available inside the build process template. The Associate Changesets and Work Items for non-Shelveset Builds activity (located inside the Try Compile, Test, and Associate Changesets and Work Items activity) defines and populates a variable called associatedWorkItems. You can see that this variable is an IList containing instances of the Changeset class (from the Microsoft.TeamFoundation.VersionControl.Client namespace). Now, if you want to access this variable later on in the build process template, you need to declare a new variable in the corresponding scope and then assign the value to it. In this sample, I declared a variable called assocChangesets in the RunAgent sequence, which basically covers the whole compile, test and drop part of the build process. Now you need to assign the value from AssociatedChangesets to this variable; this is done using the Assign workflow activity. Now you can add a custom activity anywhere inside the RunAgent sequence and use this variable. NB: of course your activity must be placed somewhere after the variable has been populated.
    To finish off, here is a code snippet that shows how you can read the changeset and work item information from the variable. First you add an InArgument to your activity, where you can pass in the variable that we defined:

        [RequiredArgument]
        public InArgument<IList<Changeset>> AssociatedChangesets { get; set; }

    Then you can traverse all the changesets in the list, and for each changeset use the WorkItems property to get the work items that were associated with that changeset:

        foreach (Changeset ch in associatedChangesets)
        {
            // Add change
            theChangesets.Add(
                new AssociatedChangeset(ch.ChangesetId,
                                        ch.ArtifactUri,
                                        ch.Committer,
                                        ch.Comment,
                                        ch.ChangesetId));

            foreach (var wi in ch.WorkItems)
            {
                theWorkItems.Add(
                    new AssociatedWorkItem(wi["System.AssignedTo"].ToString(),
                                           wi.Id,
                                           wi["System.State"].ToString(),
                                           wi.Title,
                                           wi.Type.Name,
                                           wi.Id,
                                           wi.Uri));
            }
        }

    NB: AssociatedChangeset and AssociatedWorkItem are custom classes that we use internally for storing this information, which is eventually pushed to the release repository.

    Read the article

  • SharePoint.DesignFactory.ContentFiles–building WCM sites

    - by svdoever
    One of the use cases where we use the SharePoint.DesignFactory.ContentFiles tooling is in building SharePoint Publishing (WCM) solutions for SharePoint 2007, SharePoint 2010 and Office 365. Publishing solutions are often solutions that have one instance, the publishing site (possibly with subsites), that in most cases needs to go through DTAP. If you dissect a publishing site, in most cases you have the following findings:
    - The publishing site spans a site collection
    - The branding of the site is specified in the root site, because:
      - Master pages live in the root site (/_catalogs/masterpage)
      - Page layouts live in the root site (/_catalogs/masterpage)
      - The style library lives in the root site (/Style Library) and contains images, css, javascript, xslt transformations for your CQWP's, …
      - Preconfigured web parts live in the root site (/_catalogs/wp)
    - The root site and subsites contain a document library called Pages (or your language-specific version of it) containing publishing pages using the page layouts and master pages
    - The site collection contains content types, fields and lists
    When using the SharePoint.DesignFactory.ContentFiles tooling it is very easy to create, test, package and deploy the artifacts that can be uploaded to the SharePoint content database. This can be done in a fast and simple way without the need to create and deploy WSP packages. If we look at the above list of artifacts, we can use SharePoint.DesignFactory.ContentFiles for master pages, page layouts, the style library, web part configurations, and initial publishing pages (these are normally made through the SharePoint web UI). Some artifacts in the above list, like content types, fields and lists, can NOT be handled by SharePoint.DesignFactory.ContentFiles, because they can't be uploaded to the SharePoint content database. The good thing is that these are the artifacts that don't change that much in the development of a SharePoint Publishing solution. There are however multiple ways to create these artifacts:
    1. Use paper script: create them manually in each of the environments based on documentation
    2. Automate the creation of the artifacts using (PowerShell) script
    3. Develop a WSP package to create these artifacts
    I'm not a big fan of the third option (see my blog post Thoughts on building deployable and updatable SharePoint solutions). It is a lot of work to create content types, fields and list definitions using all kinds of XML files, and it is not allowed to modify these artifacts when in use. I know… SharePoint 2010 has some content type upgrade possibilities, but I think it is just too cumbersome. The first option has the problem that content types and fields get IDs, and that these IDs must be used by the metadata on, for example, page layouts. No problem for SharePoint.DesignFactory.ContentFiles, because it supports deploy-time resolving of these IDs using PowerShell. For example, consider the following metadata definition for the page layout, contactpage-wcm.aspx.properties.ps1:

        # This script must return a hashtable @{ name=value; ... } of field name-value pairs
        # for the content file that this script applies to.
        # On deployment to SharePoint, these values are written as fields in the corresponding list item (if any).
        # Note that fields must exist; they can be updated but not created or deleted.
        # This script is called right after the file is deployed to SharePoint.
        # You can use the script parameters and arbitrary PowerShell code to interact with SharePoint,
        # e.g. to calculate properties and values at deployment time.

        param([string]$SourcePath, [string]$RelativeUrl, $Context)

        @{
            "ContentTypeId" = $Context.GetContentTypeID('GeneralPage');
            "MasterPageDescription" = "Cloud Aviator Contact pagelayout (wcm - don't use)";
            "PublishingHidden" = "1";
            "PublishingAssociatedContentType" = $Context.GetAssociatedContentTypeInfo('GeneralPage')
        }

    The PowerShell functions GetContentTypeID and GetAssociatedContentTypeInfo can, at deploy time, resolve the required information from the server we are deploying to. I personally prefer the second option: automate creation through PowerShell, because there are PowerShell scripts available to export content types and fields. An example project structure for a typical SharePoint WCM site looks like the one shown here; note that this project uses DualLayout. So if you build Publishing sites using SharePoint, check out the completely free SharePoint.DesignFactory.ContentFiles tooling and start flying!

    Read the article

  • Is HR/Recruitment Really Ready For Innovative Candidates

    - by david.talamelli
    Before I begin this blog post, I want to acknowledge that there are some great HR/Recruitment people out there who are innovative and are leading the way in using new means to successfully attract and connect with talented people. For those of you who fit in this category, please keep thinking outside the square: just because what you do may not be the norm doesn't mean it is bad. Ok, with that acknowledgment out of the way: earlier this morning (I started this post Friday morning) I came across this online profile via a tweet from Philip Tusing. I love the information that Jason has put on his web pages. Jason's work clearly demonstrates his skills and experience, and I love how he relates that experience to show how it will help an employer and what the value-add of having him on your team is. Looking at Jason's profile makes me think, though: is HR/Recruitment in general terms ready to deal with innovative candidates? Sure, most recruiters are online in some form or another, but how many actually have a process that is flexible enough to deal with someone who may not fit into your processes? Is your company's recruitment practice proactive enough to find Jason's web pages? I am not sure what he is doing in terms of a job search, but if he is not mailing a resume or replying to ads on a job board, hopefully Jason comes up in some of the candidate searches you are doing. Once you find this information, would the information Jason provides fit nicely into your Applicant Tracking System or your database? If not, how much of the intangible information are you losing and potentially not passing on to a hiring manager? I think what has worked in the past will not necessarily work in the future. Candidates want to work somewhere they will be challenged and learn and grow. If your HR/Recruitment team's processes don't convey this message, they could turn away people who were once interested in your company. For example (and I have to admit I still do some of these things myself), after calling up and having a talk with a candidate, a company may say:
    1) HR Question: Send me a copy of your resume - Candidate Reply: you actually already have my resume, the web page is http://
    2) HR Question: Come in for a chat so we can get to know you - Candidate Reply: if this is the basis of a meeting, you already know me and my thoughts by looking at my online links (blog, portfolio, homepage, etc...)
    These questions, if not handled properly, could turn a candidate from being interested in your company to not being interested. It could demonstrate that your company is not social media savvy, or give the impression of not really being all that innovative. A candidate may think: if this company isn't able to take information I have provided in the public forum and use it, is it really a company I want to work for? I think when liaising with candidates a company should utilise the information the person has provided in the public domain. A candidate may inadvertently give you answers to many of the questions you are seeking through their online presence, saving everyone time instead of having to fill out forms or paperwork. If you build this into your conversations with your candidates, the service you provide becomes much more individualised and really demonstrates to a candidate that you are thinking of them as an individual.
    Yes, I know we need to have processes in place, and I am not saying don't work to those processes, but don't let process take away a candidate's individuality. Don't let your process inadvertently scare away the top candidates that you may want in your company. This article was originally posted on David Talamelli's Blog - David's Journal on Tap

    Read the article
