Search Results

Search found 7726 results on 310 pages for 'twitter integration'.

  • Recommended programming language for linux server management and web ui integration

    - by Brendan Martens
    I am interested in making an in-house web UI to ease some of the management tasks I face administering many servers; think Canonical's Landscape. This means doing things like applying package updates simultaneously across servers, perhaps installing a custom .deb (I use Ubuntu/Debian), reviewing server logs, executing custom scripts, and viewing status information for all my servers. I hope to be able to reuse existing command line tools instead of rewriting the exact same operations in a different language myself. I really want to keep managing things at the ssh level while having the power of a web interface for easily applying the same infrastructure-wide changes; the two should not be mutually exclusive. What are some recommended programming languages for doing this kind of development and tying it into a web UI, and why do you recommend them? I am not an experienced programmer, but I view this as an opportunity to scratch some of my own itches as well as become a better programmer. I do not care whether one language is harder than another; I am more interested in picking the best tools for the job from the beginning. Feel free to recommend any existing projects that already integrate management of many systems into a single cohesive web UI, except Landscape (not free), eBox (the eBox control center is not free) and Webmin (I don't like it; it feels clunky, does not integrate well with the "Debian way" of maintaining a server, and only manages one system). Thanks for any ideas!

    Update: I am not looking to reinvent the wheel of systems management; I just want to "glue" many preexisting and excellent tools together where possible and appropriate. That is why I am wondering which languages can interact well with existing command line tools while making them manageable through a web UI.
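
    To make the "glue" idea concrete: any language with a solid standard library and a lightweight web framework can simply shell out to the existing command line tools and present the results over HTTP, which is why Python, Ruby and Perl usually come up for this. Below is a minimal sketch in Python using Flask; the host names, the ssh-based helper and the /uptime route are placeholder assumptions for illustration, not a statement about how Landscape or any particular project works, and it presumes key-based ssh access is already set up.

        import subprocess

        from flask import Flask, jsonify

        app = Flask(__name__)

        # Hypothetical inventory; in practice this would come from a file or database.
        HOSTS = ["web01.example.com", "web02.example.com"]

        def run_remote(host, command):
            # Reuse the existing ssh tooling instead of reimplementing it;
            # the web layer is only a thin wrapper around the command line.
            result = subprocess.run(["ssh", host, command],
                                    capture_output=True, text=True, timeout=30)
            return {"host": host,
                    "exit_code": result.returncode,
                    "stdout": result.stdout,
                    "stderr": result.stderr}

        @app.route("/uptime")
        def uptime():
            # Fan the same command out to every host and collect the results
            # so the web UI can show one infrastructure-wide view.
            return jsonify({"results": [run_remote(h, "uptime") for h in HOSTS]})

        if __name__ == "__main__":
            app.run(port=8080)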

    Read the article

  • WebsitePanel and BIND integration issues

    - by Fariborz Navidan
    I have set up a new Windows server with the WebsitePanel standalone server installed on it, and I have configured everything. When I create a new space in WSP, the only information saved in the DNS zone file is the SOA record; no A records are added. Each service has its own DNS zone template records, but they are not reflected in the domain's DNS zone. Also, when I delete the space, the DNS zone is not deleted. I am using WebsitePanel 1.2.1.
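
    For comparison, this is roughly what a zone would be expected to contain once the template records have been applied; at the moment only the SOA portion is being written. The names and addresses below are made-up placeholders, not WebsitePanel defaults.

        $TTL 3600
        @       IN  SOA  ns1.example.com. hostmaster.example.com. (
                         2012010101 ; serial
                         3600       ; refresh
                         600        ; retry
                         604800     ; expire
                         3600 )     ; negative caching TTL
                IN  NS   ns1.example.com.
                IN  NS   ns2.example.com.
        @       IN  A    203.0.113.10
        www     IN  A    203.0.113.10
        mail    IN  A    203.0.113.20
        @       IN  MX   10 mail.example.com.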

    Read the article

  • Ubuntu upstart sysvinit integration

    - by Eli
    Hi, I use Ubuntu Lucid server and have a sysvinit startup script that's been working for me for years. I only started using Lucid recently, and found out about Upstart when my script stopped working. The script depends on the MySQL server, which, unlike in older versions, now uses Upstart. Migrating my script to Upstart isn't an option for now. I would like to keep using my sysvinit script, but how do I make sure it starts after MySQL? Rgds, Eli
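
    One lightweight approach, sketched below and not tested against this particular script, is to leave the sysvinit script untouched and add a small Upstart job that fires once MySQL reports started and simply invokes the old script. The job name and script path are placeholders.

        # /etc/init/legacy-after-mysql.conf  (hypothetical job name)
        description "Run legacy sysvinit script once MySQL is up"

        # 'started mysql' is emitted after the mysql Upstart job is running
        start on started mysql
        stop on runlevel [016]

        # run to completion rather than being supervised as a daemon
        task
        exec /etc/init.d/legacy-script start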

    Read the article

  • OSX 10.6 integration into NIS/netgroup/automount infrastructure

    - by mdpc
    I have an existing infrastructure where accounts are maintained under NIS (yp) with no local Unix accounts. All the standard maps, including hosts, mail aliases, netgroups and so on, are also maintained this way. We make extensive use of the UNIX/Linux automounter, with items scattered across the network on NFS servers. There are no ACLs on any local or shared files. All mail needs to use what is basically the nullclient sendmail configuration, feeding into a different system. I now have a requirement to integrate an Apple OS X 10.6 system into this environment and make it run seamlessly. My initial reading and second-hand information seem to indicate that this may not be possible on a stock OS X 10.6 system, which concerns me. Any ideas on how to accomplish this task and make everybody happy? Thanks. PS: I have never used an Apple OS X system.

    Read the article

  • Active directory integration not working properly with winbind and samba

    - by tubaguy50035
    I'm trying to get my Linux box to use Active Directory authentication, and I believe I have almost everything set up correctly. I'm able to issue wbinfo -g and wbinfo -u and see all the groups and users respectively.

    A brief intro to my setup: the username I use on my Linux box to do admin things is nick. My Active Directory username is nwalke. They have two different passwords. I am able to log in to the box as nick with that user's password, and I'm also able to log in as nwalke with nwalke's password.

    The curious bit: upon creating the Active Directory user's home directory, I run a script that requires root access. This sets up some system-wide things, like a Samba share for the user. When I log in as nwalke, I enter my nwalke password and it succeeds. I'm then greeted with [sudo] password for nick:. If I enter my nwalke password here, it says Sorry, try again. If I enter nick's password, it says Sorry, user nick is not allowed to execute scriptname as root. If I run groups as nwalke, I see that my user has magically been given the group nick.

    Now, I had mistakenly thought that nick had a UID of 100, not 1000, so originally my smb.conf had idmap uid = 1000-10000. The only thing I can think of is that I logged in as nwalke while that was still set, and now I'm just being presented with a UID of 1000, making Linux think I'm nick. I'm not really sure where to go from here. Like I said, I'm fairly certain Active Directory is communicating with my server properly, but something must not be mapped right on the Linux side. Any thoughts? Here is my smb.conf:

        [global]
           security = ads
           netbios name = hostname
           realm = COMPANY.COM
           password server = adshost.company.com
           workgroup = COMPANY
           idmap uid = 10000-90000
           idmap gid = 10000-90000
           winbind separator = +
           winbind enum users = no
           winbind enum groups = no
           winbind use default domain = yes
           template homedir = /home/%D/%U
           template shell = /bin/bash
           client use spnego = yes
           domain master = no
           load printers = no
           printing = bsd
           printcap name = /dev/null
           disable spoolss = yes

    Let me know if more information about something is required.

    Read the article

  • Integration of SharePoint 2010 with TFS2010

    - by Kabir Rao
    We have performed the following steps so far: installed TFS 2010 10.0.30319.1 (RTM) on Windows Server 2008 R2 Enterprise (app tier); installed SQL Server 2008 SP1 with Cumulative Update 2 on Windows Server 2008 R2 Enterprise (data tier); Reporting Services is installed on the app tier. After this installation was working fine, we installed SharePoint 2010 on the app tier and then followed http://blogs.msdn.com/b/team_foundation/archive/2010/03/06/configuring-sharepoint-server-2010-beta-for-dashboard-compatibility-with-tfs-2010-beta2-rc.aspx for the configuration. We are not able to perform the last step described in the link because the following error occurs:

    TF249063: The following Web service is not available: http://apptier:31254/_vti_bin/TeamFoundationIntegrationService.asmx. This Web service is used for the Team Foundation Server Extensions for SharePoint Products. The underlying error is: The remote server returned an error: (404) Not Found. Verify that the following URL points to a valid SharePoint Web application and that the application is available: http://apptier:31254. If the URL is correct and the Web application is operating normally, verify that a firewall is not blocking access to the Web application.

    We have also noticed that the Documents folder in the team project has a red X. Please help. Thanks upfront.

    Read the article

  • Graphite Integration breaks ganglia/gmetad?

    - by Falk Stern
    I'm trying to forward metrics from gmetad to Graphite/Carbon. After configuring carbon_server and ganglia_prefix in gmetad.conf, gmetad starts losing metrics. The gmetad version is 3.3.5; carbon/whisper/graphite-web are at 0.9.8. There is no I/O or CPU bottleneck on the system (an HP DL385 G7 with 2 SSDs in RAID 0). I even configured another gmetad on a remote host to send metrics to Graphite/Carbon, and it also broke down. Does anyone else experience this?

    Read the article

  • Cisco Spam Blocker, Iron Port, Lotus Domino, Integration Help

    - by NickToyota
    Hi serverfault universe, I work for a medium-sized (roughly 200 user) company. We are attempting to integrate our new Cisco Spam & Virus Blocker (IronPort) appliance into our network so that it acts as an incoming filter and then passes mail on to our Lotus Domino mail server, and vice versa for outgoing mail. The way our network is currently set up, an MX record points to our Domino SMTP server, which acts as the inbound gateway and filter (using Symantec's Domino mail security software). We want to replace that inbound gateway with the IronPort. Our company has also invested in a pool of external IP addresses, which I believe have been assigned to our web and email servers. What would the proper course of action be to integrate the device successfully? An MX record change? Replacing the Domino gateway completely with the IronPort? We attempted to set the IronPort device to the external IP that our MX record points to, without much success. Any help on the proper setup would be greatly appreciated.

    Read the article

  • tacacs integration with database

    - by chingupt
    We are setting up TACACS+ in our network, which is a mix of Cisco APs and other brands. We have a centralized management system that allows our customers to centrally configure services, so we would like to set up a TACACS+ server integrated with a database. Can this be done? I found the tac_plus package at www.shrubbery.net/tac_plus/ but it does not have the necessary plugins for a database backend. Please let me know how to go about this. TIA, Sachin

    Read the article

  • What would be the ideal SAS package to integrate with an enterprise-scale web application developed using Java

    - by Rajesh
    SAS-Java connectivity and integration for an enterprise-class web application: I am trying to narrow down the available approaches for connecting to SAS from Java. I am assuming the following; please correct my assumptions if you think they are not right: SAS/ACCESS (ODBC, JDBC) or SAS/SHARE*NET to query SAS datasets using a JDBC model; SAS/IntrNet for connectivity with Java and building small-scale applications; SAS Integration Technologies for connectivity with Java and building distributed Java applications. The scenario is that I need to build an enterprise-class web application using Java/J2EE and ensure this application talks to SAS to query SAS datasets, execute SAS programs and generate reports. I am looking for a cost-effective and robust solution which will work in a multi-user environment.

    Read the article

  • MySQL ADO.NET Connector & MSSQL Integration Services

    - by user1114330
    Here I am, day three... attempting to sync a data view on a Windows Vista box (64-bit) running MSSQL 2012 and Visual Studio 2010. Sanity is slipping and hunger for progress fills my attention. I went through hell trying to get the MySQL ODBC drivers to do the job, but to no avail; everyone seems to be lost, and all the threads I can find offer solutions that do not work for me. The problem: System DSNs not being seen by SSIS ("SSIS DSN Not Showing as ODBC Data Source").

    So I decided to try the ADO.NET connector and, to my surprise, it actually appears in the data source selection list in SSIS. I take off running and create a Data Flow Task and an ADO.NET Source (a local MSSQL DB); all is good as usual. Then I move swiftly to creating an ADO.NET Destination and enter my credentials... wow, I am finally selecting a database on my Linux server! Happy, thinking that I have finally figured out a way to get the job done, I move on to mappings... nope, something is wrong. I am getting an error that hurts my eyes:

    Pipeline component has returned HRESULT error code 0xC0208457 from a method call. Error at Data Flow Task [ADO NET Destination [81]]: Failed to get properties of external columns. The table name you entered may not exist, or you do not have SELECT permission on the table object, and an alternative attempt to get column properties through connection has failed. Detailed error messages are: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '"database"."tablename"' at line 1. The descriptor files on path C:\Program Files (x86)\Microsoft SQL Server\110\DTS\ProviderDescriptors\ do not contain schema information for connection of type MySQL.Data.MySqlClient.MySqlConnection.

    So it looks like it can't get the information and therefore I cannot map the tables properly. Any ideas on this would be ultra helpful... thanks in advance to all!

    Read the article

  • WordPress and EverNote integration, any existing solutions?

    - by JXITC
    Actually, the tag and category systems in WordPress and Evernote are very similar, almost exactly the same. We can create a notebook or notebook group in Evernote; its counterpart in WordPress is a category or sub-category. We can put multiple tags on any note in Evernote, and similarly we can tag any post in WordPress. Since the two categorize articles in such similar ways, I am wondering whether there is an existing solution to automatically convert between them: one click to post my Evernote note to WordPress under the same category and tags, or to pull my WordPress article down into my Evernote account under the same category and tags. Preferably it would also handle image/video uploading. :) I saw a similar question asked years ago: Posting from evernote to wordpress. So I was wondering, is there any news on this idea now?
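
    I have not found a turnkey tool either, but as a rough illustration of the WordPress half of the idea, the standard WordPress XML-RPC API already accepts a category and tag names along with a new post, so a small script could push a note's title, body, notebook name and tags across in one call. The sketch below is in Python; the site URL, credentials and the note dictionary are made-up placeholders, and the Evernote side (which would come from the Evernote SDK) is not shown.

        import xmlrpc.client

        # Hypothetical note data; a real tool would pull this from the Evernote API
        # (notebook name -> category, note tags -> post tags).
        note = {
            "title": "My note title",
            "body": "<p>Note content exported as HTML</p>",
            "notebook": "Travel",
            "tags": ["japan", "planning"],
        }

        # WordPress exposes XML-RPC at /xmlrpc.php (placeholder site and credentials).
        wp = xmlrpc.client.ServerProxy("https://example.com/xmlrpc.php")

        post_id = wp.wp.newPost(
            0, "username", "password",
            {
                "post_title": note["title"],
                "post_content": note["body"],
                "post_status": "publish",
                # terms_names lets the same category and tag names be reused directly.
                "terms_names": {
                    "category": [note["notebook"]],
                    "post_tag": note["tags"],
                },
            },
        )
        print("Created WordPress post", post_id)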

    Read the article

  • ID Badge Access System for Building with Active Directory Integration [closed]

    - by Alex
    I hope this is the right place for this question. We're looking into setting up a building access system that uses badges or cards of some kind. I wanted to ask the users on here whether they've done such setups and whether they have recommendations. Is there perhaps a system that integrates with Active Directory? I know one of the things our managers want is to be able to run reports on when people are entering the buildings. I'd appreciate any suggestions, and thanks in advance!

    Read the article

  • Autofac WCF integration + sessions

    - by Michael Sagalovich
    I have an ASP.NET MVC 3 application that collaborates with a WCF service, which is hosted using the Autofac host factory. Here are some code samples.

    The .svc file:

        <%@ ServiceHost Language="C#" Debug="true" Service="MyNamespace.IMyContract, MyAssembly" Factory="Autofac.Integration.Wcf.AutofacServiceHostFactory, Autofac.Integration.Wcf" %>

    Global.asax of the WCF service project:

        protected void Application_Start(object sender, EventArgs e)
        {
            ContainerBuilder builder = new ContainerBuilder();
            //Here I perform all registrations, including implementation of IMyContract
            AutofacServiceHostFactory.Container = builder.Build();
        }

    Client proxy class constructor (MVC side):

        ContainerBuilder builder = new ContainerBuilder();
        builder.Register(c => new ChannelFactory<IMyContract>(
                new BasicHttpBinding(),
                new EndpointAddress(Settings.Default.Url_MyService)))
            .SingleInstance();
        builder.Register(c => c.Resolve<ChannelFactory<IMyContract>>().CreateChannel())
            .UseWcfSafeRelease();
        _container = builder.Build();

    This works fine until I want the WCF service to allow or require sessions ([ServiceContract(SessionMode = SessionMode.Allowed)] or [ServiceContract(SessionMode = SessionMode.Required)]) and to share one session with the MVC side. I changed the binding to WSHttpBinding on the MVC side, but I am getting different exceptions depending on how I tune it. I also tried changing AutofacServiceHostFactory to AutofacWebServiceHostFactory, with no result. I am not using a config file, as I am mainly experimenting rather than developing a real-life application, but I need to study the case; if you think I can achieve what I need only with config files, then OK, I'll use them. I will provide exception details for each combination of settings if required; I'm omitting them so as not to make the post too large. Any ideas on what I can do?

    Read the article

  • Integration test failing through NUnit Gui/Console, but passes through TestDriven in IDE

    - by Cliff
    I am using NHibernate against an Oracle database with the NHibernate.Driver.OracleDataClientDriver driver class. I have an integration test that pulls back the expected data properly when executed through the IDE using TestDriven.Net. However, when I run the test through the NUnit GUI or console, NHibernate throws an exception saying it cannot find the Oracle.DataAccess assembly. Obviously, this prevents me from running my integration tests as part of my CI process.

    NHibernate.HibernateException: The IDbCommand and IDbConnection implementation in the assembly Oracle.DataAccess could not be found. Ensure that the assembly Oracle.DataAccess is located in the application directory or in the Global Assembly Cache. If the assembly is in the GAC, use the <qualifyAssembly> element in the application configuration file to specify the full name of the assembly.

    I have tried making the assembly available in two ways: by copying it into the bin\debug folder, and by adding the <qualifyAssembly> element to the config file. Again, both methods work when executing through TestDriven in the IDE; neither works when executing through the NUnit GUI/console. The NUnit GUI log displays the following message:

    21:42:26,377 ERROR [TestRunnerThread] ReflectHelper [(null)] - Could not load type Oracle.DataAccess.Client.OracleConnection, Oracle.DataAccess. System.BadImageFormatException: Could not load file or assembly 'Oracle.DataAccess, Version=2.111.7.20, Culture=neutral, PublicKeyToken=89b483f429c47342' or one of its dependencies. An attempt was made to load a program with an incorrect format. File name: 'Oracle.DataAccess, Version=2.111.7.20, Culture=neutral, PublicKeyToken=89b483f429c47342' --- System.BadImageFormatException: Could not load file or assembly 'Oracle.DataAccess' or one of its dependencies. An attempt was made to load a program with an incorrect format. File name: 'Oracle.DataAccess'

    I am running NUnit 2.4.8, TestDriven.Net 2.24 and VS2008 SP1 on Windows 7 64-bit, with Oracle Data Provider v2.111.7.20 and NHibernate v2.1.0.4. Has anyone run into this issue, or better yet, fixed it?

    Read the article

  • Returned JSON from Twitter and displaying tweets using FlexSlider

    - by Trey Copeland
    After sending a request to the Twitter API using a geocode, I'm getting back a JSON response with a list of tweets. I then turn that into a PHP array using json_decode() and use a foreach loop to output what I need. I'm using FlexSlider to show the tweets vertically after wrapping them in a list. What I want is for it to only show 10 tweets at a time and scroll through them infinitely, like an escalator. Here's my loop to output the tweets:

        foreach ($tweets["results"] as $result) {
            $str = preg_replace('/[^\00-\255]+/u', '', $result["text"]);
            echo '<ul class="slides">';
            echo '<li><a href="http://twitter.com/' . $result["from_user"] . '"><img src=' . $result["profile_image_url"] . '></a>' . $str . '</li><br /><br />';
            echo '</ul>';
        }

    My jQuery looks like this as of right now, as I'm trying to play around with things:

        $(window).load(function() {
            $('.flexslider').flexslider({
                slideDirection: "vertical",
                start: function(slider) {
                    //$('.flexslider .slides > li gt(10)').hide();
                },
                after: function(slider) {
                    // current.sl
                }
            });
        });

    Non-working demo here: http://macklabmedia.com/tweet/

    Read the article

  • Enterprise integration of disparate systems

    - by Chris Latta
    We're about to embark on a fairly large integration effort to kill off a bunch of Access and SQL Server databases and get everything into one coherent enterprise system. There are also a number of other systems (accounting, CRM, payroll, MS Exchange) that hold critical data we need to integrate (for data validation in other systems), report on and otherwise expose. It is likely that some of these systems will change in the next few years, so we need to isolate our systems to be ready for change. Ideally we would be able to expose our forms in a consistent manner across as many of our systems as possible without having to re-develop them for each system. We are currently targeting SharePoint (2007 and soon 2010), Office (2007 and soon 2010: Word, Excel, PowerPoint and Outlook), Reporting Services, .NET console applications, .NET Windows applications, shell extensions, and possibly exposing some functionality on mobile devices (BlackBerries currently, maybe iPhones later) and via our website. We're moving development to Visual Studio 2010 (from 2005) ahead of migrating to SharePoint 2010 and Office 2010. Given that most of our development presently targets the .NET framework (mostly in C#), it seems logical to stick with this unless there is some compelling reason to switch frameworks or platforms for some aspects.

    We're thinking of your standard layering: database, data integration layer, business objects layer, web services (or REST) layer, client application, plus doing our own client application with WPF (or something else?) forms that can also be exposed in the MS systems (SharePoint, Office, Windows). So, we don't want much, just everything :) Basically we need to isolate ourselves from database and system changes, create an API that can be used throughout our systems, and then make this functionality available in our client applications. I'm very keen to get pointers from anyone who has tips on how to pull this off. Should we look at the Enterprise Library as a place to start? Is REST with ASP.NET MVC 2 a better solution than web services for a system like this? Will WPF deliver forms re-use, or is there something better?

    Read the article

  • TouchXML to read in twitter feed for iphone app

    - by Fiona
    Hello there. I've managed to get the feed from Twitter and am attempting to parse it. I only require the following fields from the feed: name, description, time_zone and created_at. I am successfully pulling out name and description; however, time_zone and created_at are always nil. The following is the code. Anyone see why this might not be working?

        -(void) friends_timeline_callback:(NSData *)data{
            NSString *string = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
            NSLog(@"Data from twitter: %@", string);
            NSMutableArray *res = [[NSMutableArray alloc] init];
            CXMLDocument *doc = [[[CXMLDocument alloc] initWithData:data options:0 error:nil] autorelease];
            NSArray *nodes = nil;
            //! searching for item nodes
            nodes = [doc nodesForXPath:@"/statuses/status/user" error:nil];
            for (CXMLElement *node in nodes) {
                int counter;
                Contact *contact = [[Contact alloc] init];
                for (counter = 0; counter < [node childCount]; counter++) {
                    //pulling out name and description only for the minute!!!
                    if ([[[node childAtIndex:counter] name] isEqual:@"name"]){
                        contact.name = [[node childAtIndex:counter] stringValue];
                    }else if ([[[node childAtIndex:counter] name] isEqual:@"description"]) {
                        // common procedure: dictionary with keys/values from XML node
                        if ([[node childAtIndex:counter] stringValue] == NULL){
                            contact.nextAction = @"No description";
                        }else{
                            contact.nextAction = [[node childAtIndex:counter] stringValue];
                        }
                    }else if ([[[node childAtIndex:counter] name] isEqual:@"created_at"]){
                        contact.date == [[node childAtIndex:counter] stringValue];
                    }else if([[[node childAtIndex:counter] name] isEqual:@"time_zone"]){
                        contact.status == [[node childAtIndex:counter] stringValue];
                        [res addObject:contact];
                        [contact release];
                    }
                }
            }
            self.contactsArray = res;
            [res release];
            [self.tableView reloadData];
        }

    Thanks in advance for your help!! Fiona

    Read the article

  • What is the best way to mock a 3rd party object in ruby?

    - by spinlock
    I'm writing a test app using the twitter gem and I'd like to write an integration test but I can't figure out how to mock the objects in the Twitter namespace. Here's the function that I want to test:

        def build_twitter(omniauth)
          Twitter.configure do |config|
            config.consumer_key = TWITTER_KEY
            config.consumer_secret = TWITTER_SECRET
            config.oauth_token = omniauth['credentials']['token']
            config.oauth_token_secret = omniauth['credentials']['secret']
          end
          client = Twitter::Client.new
          user = client.current_user
          self.name = user.name
        end

    and here's the rspec test that I'm trying to write:

        feature 'testing oauth' do
          before(:each) do
            @twitter = double("Twitter")
            @twitter.stub!(:configure).and_return true
            @client = double("Twitter::Client")
            @client.stub!(:current_user).and_return(@user)
            @user = double("Twitter::User")
            @user.stub!(:name).and_return("Tester")
          end

          scenario 'twitter' do
            visit root_path
            login_with_oauth
            page.should have_content("Pages#home")
          end
        end

    But, I'm getting this error:

        1) testing oauth twitter
           Failure/Error: login_with_oauth
           Twitter::Error::Unauthorized:
             GET https://api.twitter.com/1/account/verify_credentials.json: 401: Invalid / expired Token
           # ./app/models/user.rb:40:in `build_twitter'
           # ./app/models/user.rb:16:in `build_authentication'
           # ./app/controllers/authentications_controller.rb:47:in `create'
           # ./spec/support/integration_spec_helper.rb:3:in `login_with_oauth'
           # ./spec/integration/twit_test.rb:16:in `block (2 levels) in <top (required)>'

    The mocks above are using rspec but I'm open to trying mocha too. Any help would be greatly appreciated.

    Read the article

  • Mysql - wondering about scaling a twitter-like application ?

    - by user246114
    Hi, I'm developing an app that is vaguely similar to Twitter, in that it allows users to follow one another. I wanted to do this using Google App Engine for its scalability promises, but it's proving kind of difficult to get running for a few different reasons. I'd basically like to have a _users table and a _followers table: users go into _users, and follower relationships go into _followers. The problem is that each row in the _users table will probably have something like 100 corresponding records in the _followers table as users start following one another, so the number of rows is going to explode quickly. Using App Engine, the volume [shouldn't] be a problem. If I go with MySQL and I actually start to get some traction, how do I scale this up? Am I just going to end up moving to a distributed database in the end anyway? Should I fight it out with Google App Engine? I read that Twitter was using MySQL, ran into this exact problem, and is now switching to Cassandra. Thanks
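
    For what it's worth, the relational layout itself stays tiny; the scaling question is about row volume and sharding, not the schema. Here is a rough sketch of the two tables, shown with Python's built-in sqlite3 purely so the example is self-contained; the same DDL carries over to MySQL, and the table and column names are just placeholders.

        import sqlite3

        conn = sqlite3.connect(":memory:")  # stand-in for the real MySQL database
        conn.executescript("""
            CREATE TABLE users (
                id       INTEGER PRIMARY KEY,
                username TEXT NOT NULL UNIQUE
            );

            -- One row per follow relationship; the composite key prevents duplicate
            -- follows and doubles as the index for "who does X follow?".
            -- The extra index answers "who follows Y?".
            CREATE TABLE followers (
                follower_id INTEGER NOT NULL REFERENCES users(id),
                followee_id INTEGER NOT NULL REFERENCES users(id),
                PRIMARY KEY (follower_id, followee_id)
            );
            CREATE INDEX idx_followers_followee ON followers (followee_id);
        """)

        # A follow is just an insert; a "who follows me" query is a join.
        conn.executemany("INSERT INTO users (id, username) VALUES (?, ?)",
                         [(1, "alice"), (2, "bob")])
        conn.execute("INSERT INTO followers (follower_id, followee_id) VALUES (?, ?)",
                     (1, 2))

        who_follows_bob = conn.execute(
            "SELECT u.username FROM followers f JOIN users u ON u.id = f.follower_id "
            "WHERE f.followee_id = ?", (2,)).fetchall()
        print(who_follows_bob)  # [('alice',)]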

    Read the article

  • Proxy Authentication in .NET - for external API

    - by n0vic3c0d3r
    I'm developing a Twitter messaging utility using the Twitter API (via Twitterizer). But since I'm behind a corporate proxy, I'm getting the error '407 Proxy Authentication Required'. Is there any way to authenticate the user before calling the API, or to use the default proxy settings? P.S. Internally the API uses HttpWebRequest.
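
    Since the library ultimately goes through HttpWebRequest, one avenue worth trying (a sketch, not verified against Twitterizer specifically) is to have the application-wide default proxy send the logged-on user's Windows credentials, so every HttpWebRequest the library creates authenticates against the corporate proxy automatically. A minimal app.config along these lines:

        <!-- app.config sketch: pass default credentials to the system-configured
             proxy for all HttpWebRequest traffic in this process -->
        <configuration>
          <system.net>
            <defaultProxy enabled="true" useDefaultCredentials="true">
              <proxy usesystemdefault="true" />
            </defaultProxy>
          </system.net>
        </configuration>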

    Read the article
