Search Results

Search found 19074 results on 763 pages for 'secure government government cloud security'.


  • Are there cloud network drives that let users lock files or mark them as "in use"?

    - by Brandon Craig Rhodes
    Having spent several hours reading about the features and limitations of services like Dropbox and Jungle Disk and the hundreds of competitors they seem to have (as though everyone with an AWS account these days goes ahead and writes a file sharing application just for fun), I have yet to find one that would let a team of people at a small business collaborate without stepping all over each other's toes. At a small business there are often many small documents per project — estimates, contracts, project plans, budgets — and team members frequently have to open and edit them, with all sorts of problems happening if two people edit a file at once. Even if a sharing service is smart enough to keep both versions of the file, most small-business software (like word processors, spreadsheets, estimating software, or billing systems) has no way to compare — much less to merge! — the changes in two rival versions of a file that two people edited at the same time without each other's knowledge. So, my question: are there cloud-based file sharing solutions that not only provide a virtual network drive that people can access, but that also let users lock files — even if it's not a real lock but just a flag or indicator — that could possibly prevent remote workers from both editing the same file at once? Having one person wait for another person to finish editing is a very, very small inconvenience compared to the hour or more that it can take to compare two estimates by hand until you find and resolve the rival changes. Given this fact, I am surprised that almost none of the popular file sharing solutions seem to recognize this problem and provide some solution! Does anyone know of a service that does?

    Read the article

  • Does a VPN requirement kill the concept of having a Web Application in the Cloud?

    - by Christian
    Recently I posted a question on SO, but so far I have gotten no answers. I wonder if I'm asking the wrong question. This is the problem: we need to design an application that offers a public HTTP web service, but at the same time it must consume some services through a VPN connection from another existing company. There is no alternative but to use a VPN connection to access those services. We want to host our application on cloud infrastructure like Heroku or Amazon EC2, but there is no direct way to access the VPN services of the other company from there. The solution I'm considering, but don't like, is to have a separate server expose the services from that VPN. But this requires setting up another server, which I'd prefer to avoid. If that is the solution, can I use an Amazon EC2 instance to connect to a VPN? This is what I was thinking — is it correct? I don't have experience using VPNs, tunnels, or that kind of networking. I would really appreciate it if you could propose an alternative solution, or just leave a comment.

    Read the article

  • Migrating to AWS Cloud with auto-scaling - where to put Redis and ElasticSearch?

    - by RobMasters
    I've been trying to research this topic but haven't found anywhere that recommends where to install services such as Redis and ElasticSearch when migrating to a cloud framework. I'm currently running a Symfony2 application on 2 static servers - one is running MySQL and the other is the public facing web server, which also has Redis and ElasticSearch running on it. Both of these servers are virtualised, but they're static in the sense that they can't be replicated at present (various aspects are still dependent on the local filesystem). The goal is to migrate to AWS and use auto-scaling to be able to spin up and kill web servers as required, but I'm not clear on what I should put on each EC2 instance. Should they be single-responsibility only? i.e. set up individual instances for the web server(s), Redis, and ElasticSearch, most likely an RDS instance for MySQL, and only set up auto-scaling on the web server(s)? I don't foresee having to scale the ElasticSearch server anytime soon as it's only driving the search functionality, but it's possible that Redis may need to be replicated at some point - but should this be done manually? I'm not sure how this could be done automatically, as each instance needs to be configured to know about its master/slave(s) as far as I know. I'd appreciate advice on this. One more quick question while I'm here - how would I be able to deploy code changes when there are X web servers currently active? I'm using a Capifony deployment script (the Symfony2 version of Capistrano), which I think can handle multiple servers easily enough by specifying an array of :domain addresses... but how should this be handled when the number of web servers can vary?

    Read the article

  • Which is the best smart automatic file replication solution for cloud-storage-based systems?

    - by TORr0t
    I am looking for a solution for a project I am working on. We are developing a web system where people can upload their files and other people can download them (similar to the rapidshare.com model). The problem is, some files can be in much higher demand than others. The scenario is like this: I have uploaded my birthday video and shared it with all of my friends. I uploaded it to myproject.com and it was stored on one of the cluster nodes, which has a 100 Mbit connection. The problem is, once all of my friends want to download the file, they can't, since the bottleneck here is the 100 Mbit link, which is about 15 MB per second, but I have 1000 friends and they can each only download at about 15 KB per second. I am not taking into account that the HDD is serving the same files. My network infrastructure is as follows: a 1 Gbit server (client) connected to 4 storage server nodes that have 100 Mbit connections. The 1 Gbit server can handle the traffic of 1000 users if one of the storage nodes can stream more than 15 MB per second to my 1 Gbit (client) server, and visitors will stream directly from the client server instead of the storage nodes. I can do this by replicating the file onto 2 nodes. But I don't want to replicate all files uploaded to my network, since it would cost much more. So I need a cloud-based system which will push files onto replicated nodes automatically when demand for those files is high, and, when the demand is low, delete them from the other nodes so they stay on only 1 node. I have looked at Gluster and asked in their IRC channel; Gluster can't do such a thing. It is only able to replicate all of the files or none of them. But I need the cluster software to do it automatically. Any solutions? (Other than recommending Amazon S3.)

    Read the article

  • Stark Expo Needs You

    - by [email protected]
    Train to Become a Master Cloud Operative Can't wait until September to get your Oracle fix? Then come visit us at the Stark Expo now. Marvel Entertainment has turned itself into one of the hottest media companies of the digital age, and at the heart of Marvel's growth and transformation is Oracle technology. Now, this successful collaboration finds its way to the big screen, as Oracle joins forces with Marvel to launch a special showcase Website and movie trailer for the upcoming Iron Man 2. In Iron Man 2, Oracle is a proud sponsor of Stark Expo, a world-class tradeshow that depends on a cloud computing architecture to ensure that systems are free from overload. Starting today, visitors to the showcase Website are invited to become Master Cloud Operatives and keep Stark Expo up and running. Complete your training, test your troubleshooting skills in the Oracle Pavilion, and qualify to receive a free movie poster.

    Read the article

  • Get Ready...Oracle CloudWorld is Coming to a City Near You in 2013

    - by Gene Eun
    Is your organization considering the cloud for deploying enterprise applications? Are mobile and social part of your cloud strategy? If you answered YES to either question, then you should plan to join us at an Oracle CloudWorld event, coming to a city near you in 2013. If you attend, you'll get an opportunity to learn firsthand about Oracle Cloud, talk to product experts, see live demos, and network with other industry professionals. By the way, did I mention that Oracle CloudWorld is a FREE event? Whether you're a C-level executive, line of business manager, or hardcore application developer, Oracle CloudWorld will have valuable information for you, with keynotes, breakout sessions, demos, and dedicated tracks for Sales and Marketing, Customer Service and Support, Finance and Operations, Human Resources, Application Developers, and Applications IT. Click here to learn more about Oracle CloudWorld, including cities and dates. Hope to see you there!

    Read the article

  • Microsoft launches two new Data Centres for Azure in US to meet growing demand

    - by Gopinath
    In order to meet the growing demand for Windows Azure in the US, Microsoft has launched two new data centres there – East US and West US. With the addition of these two data centres, the number of Azure data centres across the globe has grown to 8, 4 of which are located in the US. The two new data centres are providing Compute and Storage resources, and a few enthusiastic customers have already deployed their applications. Other services like SQL Azure and AppFabric will be offered by these data centres in the coming months. The addition of new data centres is a good sign for Microsoft, as customer demand for its cloud offering is growing. Amazon Web Services is the pioneer in cloud computing and offers a wider range of cloud services than Microsoft. Source: Windows Azure Blog

    Read the article

  • Do you know what a DevOps Project is?

    - by Gopinath
    Yesterday I wrote about the OpenStack project, an open source cloud computing stack that lets you build cloud computing environments. While reading more on this topic I stumbled upon a new type of project called a DevOps project. OpenStack is all set to become the first DevOps project, reports Forbes: …the way OpenStack is applying the open source model to creating cloud infrastructure, the open source model is on the verge of being extended so that the collaboration and design process will include software, hardware, and networking in the data center as well as operational processes. In modern development, the idea of designing software, data center, and operations using one integrated team is called DevOps.

    Read the article

  • How can I provide secure web content to mobile devices that can't access an intranet?

    - by evanmcd
    I'm working with a client on developing web content for their intranet. We want users to be able to access a version of the content on their mobile devices, but most of them don't have the VPN capability to get onto the intranet. I'm wondering if anyone has had experience with this and can recommend a solution. One other thing to consider is that the content is not mission-critically secure: if someone outside the company gained access to it, it would not represent a major issue, only a minor annoyance. Thanks for any advice.

    Read the article

  • Google Chrome custom search engine for secure Wikipedia

    - by gdejohn
    I have this custom search engine set up in Google Chrome: https://encrypted.google.com/search?q=site%3Aen.wikipedia.org+%s&btnI=745 It searches Google for site:en.wikipedia.org {query}, and the btnI=745 is for I'm Feeling Lucky, so it automatically redirects to the first result. I like this better than using Wikipedia's search function directly because it gives me very effective approximate string matching, so I can misspell my search, or leave a word out, or just search for some keywords, and I still get what I'm looking for right away. What I'd like is for it to use Wikipedia's secure gateway: https://secure.wikimedia.org/wikipedia/en/wiki/ It's easy enough to set up a custom search engine that uses the secure version of Wikipedia's search function directly, but I can't figure out how to correctly incorporate it into my version going through Google. Nothing I've tried works.

    Read the article

  • Add "secure" in cookie by httpd server

    - by Abhishek
    How do I configure my httpd server to add the "Secure" flag to cookies? I tried the approach in the link below, http://blog.modsecurity.org/2008/12/fixing-both-missing-httponly-and-secure-cookie-flags.html but it did not seem to work. I inspected the cookie via Firebug and found that the cookies have "HttpOnly" but not "Secure". I double-checked the configuration and it is the same as mentioned in the link. I also noticed that the server response time goes up a bit when doing this with mod_security. Is there a better way to do it? Any ideas or pointers to configurations would be helpful (a config sketch follows this entry).

    Read the article
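
    A commonly cited alternative to the mod_security rewrite, assuming Apache 2.2.4 or later with mod_headers enabled, is to edit the Set-Cookie response header directly. This is a minimal sketch under those assumptions, not a verified fix for the asker's setup:

        # Requires mod_headers; "Header edit" needs Apache >= 2.2.4.
        # Appends HttpOnly and Secure to every Set-Cookie header the server emits.
        # Cookies that already carry the flags would get them twice, and cookies
        # that client-side scripts must read should be excluded from HttpOnly.
        Header edit Set-Cookie ^(.*)$ "$1; HttpOnly; Secure"

    Because this is a simple output-header edit, it may also avoid some of the overhead observed with the mod_security rule, though that would need measuring.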

  • Is visiting HTTPS websites on a public hotspot secure?

    - by Calmarius
    HTTPS (SSL/TLS) connections are said to be secure because the communication between the server and me is encrypted (and the server is authenticated), so if someone sniffs my packets, in theory they would need zillions of years to decrypt them by brute force. Let's assume I'm on a public wifi network and there is a malicious user on the same wifi who sniffs every packet. Now let's assume I'm trying to access my Gmail account using this wifi. My browser does an SSL/TLS handshake with the server and gets the keys to use for encryption and decryption. If that malicious user sniffed all my incoming and outgoing packets, can he calculate the same keys and read my encrypted traffic too, or even send encrypted messages to the server in my name? (A toy illustration of the key exchange follows this entry.)

    Read the article
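
    The short reason a passive sniffer cannot derive the session keys is the key exchange. The sketch below is a toy Diffie-Hellman exchange with deliberately tiny numbers (real TLS also supports other exchanges, such as RSA key transport, and uses far larger parameters); it only illustrates why seeing the handshake values is not enough:

        # Toy Diffie-Hellman, illustration only -- not the actual TLS protocol.
        p, g = 23, 5                     # public parameters, visible to the sniffer
        a, b = 6, 15                     # private values kept by client and server
        A = pow(g, a, p)                 # client sends A = g^a mod p (sniffer sees this)
        B = pow(g, b, p)                 # server sends B = g^b mod p (sniffer sees this)
        client_key = pow(B, a, p)        # client derives the shared secret
        server_key = pow(A, b, p)        # server derives the same secret
        assert client_key == server_key  # both ends agree without ever transmitting it
        # The sniffer holds p, g, A and B but not a or b; recovering the secret means
        # solving a discrete logarithm, which is infeasible at real-world sizes.

    An attacker who can actively tamper with the connection (a man-in-the-middle) is a different threat, which is what certificate validation during the handshake is meant to catch.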

  • File/folder Write/Delete wise, is my server secure?

    - by acidzombie24
    I wanted to know: if someone got access to my server using a non-root account, how much damage could he do? After I su someuser, I used this command to find all files and folders that are writable: find / -writable >> list.txt Here is the result. It's mostly /dev/something and /proc/something, plus these: /var/lock /var/run/mysqld/mysqld.sock /var/tmp /var/lib/php5 Is my system secure? /var/tmp makes sense, but I am unsure why this user has write access to those other folders. Should I change them? stat /var/lib/php5 gives me 1733, which is odd. Why write access? Why no read? Is this some kind of weird use of a temp file? (A short sketch decoding that mode follows this entry.)

    Read the article
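
    On /var/lib/php5 specifically: mode 1733 is a deliberate pattern on some distributions for PHP's session directory, and the sketch below (assuming GNU coreutils stat on a Debian-style layout) shows how to read it. It is illustrative rather than a claim about this exact system:

        # Print octal mode, symbolic mode, owner and name (GNU stat format flags).
        stat -c '%a %A %U:%G %n' /var/lib/php5
        # Typical output:  1733 drwx-wx-wt root:root /var/lib/php5
        #   owner root:  rwx (full access)
        #   group/other: -wx (can create files inside, but cannot list or read them)
        #   trailing t:  sticky bit, so users can only delete files they own
        # Net effect: PHP can write per-user session files there, while one user's
        # processes cannot read another user's sessions.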

  • Is it necessary to change the default users and groups in VMware esxi 4.0 in order to have a secure

    - by Teevus
    By default esxi creates a number of users and groups including: daemon nfsnobody root nobody vimuser dcui How secure is this default security setup? Besides changing the root password, is it advisable to modify the default users and groups? E.g. does esxi use default passwords for the accounts or anything else that could be exploited by malicious users? My scenario is very basic and I don't require any custom users or groups as only sysadmins will ever need to administer the virtual infrastructure, and they can do so using the root account. Thanks

    Read the article

  • What is the most secure way to set up a mysql user for Wordpress?

    - by Sinthia V
    I am setting up subdomain-based MU on my domain. Everything is hosted by me, running on one CentOS/Webmin VPS. Will I be better off setting the MySQL user's host as localhost, 127.0.0.1, or a wildcard %.mydomain.com? Which is more secure? Is localhost === 127.0.0.1? If not, what is the difference? Also, what is my domain from MySQL's or WordPress' point of view when I am connected by an SSH terminal? How about when I connect via Webmin or Usermin? Does MySQL see me as Webmin or as my Unix user? (A hedged example follows this entry.)

    Read the article
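
    As a general point of comparison rather than advice specific to this setup: in MySQL, 'localhost' normally means the local Unix socket while '127.0.0.1' is a TCP connection over loopback, and a user restricted to 'localhost' is usually the tightest choice when WordPress and MySQL share one server. A sketch with hypothetical database, user and password names:

        -- Hypothetical names; substitute your own and use a strong password.
        CREATE DATABASE wp_mu CHARACTER SET utf8;
        CREATE USER 'wp_mu_user'@'localhost' IDENTIFIED BY 'change-me';
        -- Grant only what WordPress needs on its own database, nothing server-wide.
        GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, ALTER, INDEX, DROP
            ON wp_mu.* TO 'wp_mu_user'@'localhost';
        FLUSH PRIVILEGES;

    A wildcard host such as '%.mydomain.com' only matters when the web server connects from a different machine; it widens the set of hosts that can even attempt to log in, so it is the least restrictive of the three.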

  • Call for Papers SOA &amp; Cloud Symposium by Thomas Erl

    - by Jürgen Kress
    3rd International SOA Symposium + 2nd International Cloud Symposium • Call for Presentations Berliner Congress Center, Alexanderstrase 11, 10178 Berlin, Germany (October 5-6, 2010) The International SOA and Cloud Symposium brings together lessons learned and emerging topics from SOA and Cloud projects, practitioners and experts. Please visit the Berlin & The Venue page for a map and more information. The two-day conference agenda will be organized into the following primary tracks: •  Track 1 SOA Architecture & Design •  Track 2 SOA Governance •  Track 3 Business of SOA •  Track 4 BPM, BPMN and Service-Orientation •  Track 5 Modeling from Services to the Enterprise •  Track 6 Real World SOA Case Studies •  Track 7 Real World Cloud Computing Case Studies •  Track 8 Cloud Computing Architecture, Standards & Technologies •  Track 9 REST and Service-Orientation in Practice •  Track 10 SOA Patterns & Practices •  Track 11 Modern ESB and Middleware •  Track 12 Semantic Web •  Track 13 SOA & BPM •  Track 14 Business of Cloud Computing •  Track 15 Cloud Computing Governance, Policies & Security   Presentation Submissions All submissions must be received no later than June 30, 2010. An overview of the tracks can be found here. Wiki with Additional Call for Papers: http://wiki.oracle.com/page/SOA+Call+for+Papers   Technorati Tags: soa,cloud,thomas erl,soasymposium,call for papers

    Read the article

  • Two Virtualization Webinars This Week

    - by chris.kawalek(at)oracle.com
    If you're interested in virtualization, be sure to catch our two free webinars this week. You'll hear directly from Oracle technologists and can ask questions in a live Q&A. Deploying Oracle VM Templates for Oracle E-Business Suite and Oracle PeopleSoft Enterprise Applications Tuesday, Feb 15, 2011 9AM Pacific Time Register Now Is your company trying to manage costs, meet or beat service level agreements, and get employees up and running quickly on business-critical applications like Oracle E-Business Suite and Oracle PeopleSoft Enterprise Applications? The fastest way to get the benefits of these applications deployed in your organization is with Oracle VM Templates. Cut application deployment time from weeks to just hours or days. Attend this session for the technical details of how your IT department can deliver rapid software deployment and eliminate installation and configuration costs by providing pre-installed and pre-configured software images. Increasing Desktop Security for the Public Sector with Oracle Desktop Virtualization Thursday, Feb 17, 2011 9AM Pacific Time Register Now Security of data as it moves across desktop devices is a concern for all industries. But organizations such as law enforcement, local, state, and federal government, and others have higher security needs than most. A virtual desktop model, where no data is ever stored on the local device, is an ideal architecture for these organizations to deploy. Oracle's comprehensive portfolio of desktop virtualization solutions, from thin client devices to server-side management and desktop hosting software, provides a complete solution for this ever-increasing problem.

    Read the article

  • Cloud computing over client-server: differences, pros and cons?

    - by Vimvq1987
    As far as I know, cloud computing might be an evolution in software architecture, and it may replace some current architectures, such as client-server. These two architectures seem similar to me (I know very little about both), but I don't know the differences between them. What are the pros and cons of cloud computing over a client-server architecture? Thank you so much.

    Read the article

  • Personal cloud storage options... HELP!

    - by rhaddan
    I'm looking for some personal cloud storage options. My biggest concern about moving to a hosted storage solution is the long-term viability of the provider. Has anyone used a cloud service that you're crazy about? I'm a Mac user, so I need to have something that will work on the Mac OS and ideally the iPhone as well.

    Read the article

  • Security Trimmed Cross Site Collection Navigation

    - by Sahil Malik
    Ad:: SharePoint 2007 Training in .NET 3.5 technologies (more information). This article will serve as documentation of a fully functional codeplex project that I just created. This project will give you a WebPart that will give you security trimmed navigation across site collections. The first question is, why create such a project? In every single SharePoint project you will do, one question you will always be faced with is, what should the boundaries of sites be, and what should the boundaries of site collections be? There is no good or bad answer to this, because it really really depends on your needs. There are some factors in play here. Site Collections will allow you to scale, as a Site collection is the smallest entity you can put inside a content database Site collections will allow you to offer different levels of SLAs, because you put a site collection on a separate content database, and put that database on a separate server. Site collections are a security boundary – and they can be moved around at will without affecting other site collections. Site collections are also a branding boundary. They are also a feature deployment boundary, so you can have two site collections on the same web application with completely different nature of services. But site collections break navigation, i.e. a site collection at “/”, and a site collection at “/sites/mySiteCollection”, are completely independent of each other. If you have access to both, the navigation of / won’t show you a link to /sites/mySiteCollection. Some people refer to this as a huge issue in SharePoint. Luckily, some workarounds exist. A long time ago, I had blogged about “Implementing Consistent Navigation across Site Collections”. That approach was a no-code solution, it worked – it gave you a consistent navigation across site collections. But, it didn’t work in a security trimmed fashion! i.e., if I don’t have access to Site Collection ‘X’, it would still show me a link to ‘X’. Well this project gets around that issue. Simply deploy this project, and it’ll give you a WebPart. You can use that WebPart as either a webpart or as a server control dropped via SharePoint designer, and it will give you Security Trimmed Cross Site Collection Navigation. The code has been written for SP2010, but it will work in SP2007 with the help of http://spwcfsupport.codeplex.com . What do I need to do to make it work? I’m glad you asked! Simple! Deploy the .wsp (which you can download here). This will give you a site collection feature called “Winsmarts Cross Site Collection Navigation” as shown below. Go ahead and activate it, and this will give you a WebPart called “Winsmarts Navigation Web Part” as shown below: Just drop this WebPart on your page, and it will show you all site collections that the currently logged in user has access to. Really it’s that easy! This is shown as below - In the above example, I have two site collections that I created at /sites/SiteCollection1 and /sites/SiteCollection2. The navigation shows the titles. You see some extraneous crap as well, you might want to clean that – I’ll talk about that in a minute. What? You’re running into problems? If the problem you’re running into is that you are prompted to login three times, and then it shows a blank webpart that says “Loading your applications ..” and then craps out!, then most probably you’re using a different authentication scheme. Behind the scenes I use a custom WCF service to perform this job. 
OOTB, I’ve set it to work with NTLM, but if you need to make it work with alternate authentication schemes such as forms-based auth or client-side certs, you will need to edit the %14%\ISAPI\Winsmarts.CrossSCNav\web.config file, specifically, this section -

    <bindings>
      <webHttpBinding>
        <binding name="customWebHttpBinding">
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Ntlm"/>
          </security>
        </binding>
      </webHttpBinding>
    </bindings>

For Kerberos, change the “clientCredentialType” to “Windows”. For forms auth, remove that transport line. For client certs – well that’s a bit more involved, but it’s just web.config changes – hit a good book on WCF or hire me for a billion trillion $. But fair warning, I might be too busy to help immediately. If you’re running into a different problem, please leave a comment below, but the code is pretty rock solid, so .. hmm .. check what you’re doing! BTW, I don’t make any guarantee/warranty on this – if this code makes you sterile, unpopular, bad hairstyle, anything else, that is your problem! But, there are some known issues: I wrote this as a concept – you can easily extend it to be more flexible. Example, hierarchical nav, or horizontal nav, jazzy effects with jQuery or Silverlight – all those are possible very very easily. This WebPart is not smart enough to co-exist with another instance of itself on the same page. I can easily extend it to do so, which I will do in my spare(!?) time! Okay good! But that’s not all! As you can see, just dropping the WebPart may show you many extraneous site collections, or maybe you want to restrict which site collections are shown, or exclude a certain site collection from the navigation. To support that, I created a property on the WebPart called “UrlMatchPattern”, which is a regex expression you specify to trim the results :). So, just edit the WebPart, and specify a string property of “http://sp2010/sites/” as shown below. Note that you can put in whatever regex expression you want! So go crazy, I don’t care! And this gives you a cleaner look. w00t! Enjoy!

    Read the article
