Google Cloud - Connect to PostgreSQL database in same project with Python


tl;dr: how do I connect to a Postgres DB from a GCE instance in the same "project" without granting IP access to 35.185.*?

========

Hello,

I have a simple test script that works locally:

print "importing"
import pandas as pd
import urllib
import json
from sqlalchemy import *
from sqlalchemy import create_engine
from sqlalchemy import Table, Column, Integer, String, MetaData, ForeignKey
print "done importing"

conn_string = 'postgresql://my_user:my_pass@ip.of.my.db/postgres'
print "connecting"
engine = create_engine(conn_string, echo=True)
print "engine created"
engine.connect()
print "engine connected"

print "getting data"
data = json.loads(urllib.urlopen("http://ip.jsontest.com").read())
df = pd.DataFrame([data])
print "data retrieved"
df.to_sql('insert_test', engine, index=False, schema='public', chunksize=500,
          if_exists='append')

Now, this works because I have explicitly granted access to the DB for my local IP:

[screenshot of the DB's authorized-networks configuration]

my output is:

importing
done importing
connecting
engine created
2017-04-14 10:17:38,319 INFO sqlalchemy.engine.base.Engine select version()
2017-04-14 10:17:38,323 INFO sqlalchemy.engine.base.Engine {}
2017-04-14 10:17:38,381 INFO sqlalchemy.engine.base.Engine select current_schema()
2017-04-14 10:17:38,385 INFO sqlalchemy.engine.base.Engine {}
2017-04-14 10:17:38,440 INFO sqlalchemy.engine.base.Engine SELECT CAST('test plain returns' AS VARCHAR(60)) AS anon_1
2017-04-14 10:17:38,447 INFO sqlalchemy.engine.base.Engine {}
2017-04-14 10:17:38,483 INFO sqlalchemy.engine.base.Engine SELECT CAST('test unicode returns' AS VARCHAR(60)) AS anon_1
2017-04-14 10:17:38,490 INFO sqlalchemy.engine.base.Engine {}
2017-04-14 10:17:38,568 INFO sqlalchemy.engine.base.Engine show standard_conforming_strings
2017-04-14 10:17:38,575 INFO sqlalchemy.engine.base.Engine {}
engine connected
getting data
data retrieved
2017-04-14 10:17:38,750 INFO sqlalchemy.engine.base.Engine SELECT relname FROM pg_class c JOIN pg_namespace n ON n.oid = c.relnamespace WHERE n.nspname = %(schema)s AND relname = %(name)s
2017-04-14 10:17:38,753 INFO sqlalchemy.engine.base.Engine {'name': u'insert_test', 'schema': u'public'}
2017-04-14 10:17:38,832 INFO sqlalchemy.engine.base.Engine BEGIN (implicit)
2017-04-14 10:17:38,841 INFO sqlalchemy.engine.base.Engine INSERT INTO public.insert_test (ip) VALUES (%(ip)s)
2017-04-14 10:17:38,848 INFO sqlalchemy.engine.base.Engine {'ip': u'my_local_ip'}
2017-04-14 10:17:38,903 INFO sqlalchemy.engine.base.Engine COMMIT

When I move the script to a GCE instance and run it with:

$ python hello_db.py 

my output is the following:

File "/home/user/anaconda2/lib/python2.7/site-packages/psycopg2-2.7.1-py2.7-linux-x86_64.egg/psycopg2/__init__.py", line 130, in connect
    conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection timed out
        Is the server running on host "ip.of.my.db" and accepting
        TCP/IP connections on port 5432?
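Before digging into SQLAlchemy, it can help to confirm whether the instance can reach the DB's port at all. Here is a minimal sketch using only the standard library (`ip.of.my.db` is the same placeholder host as in the script above):

```python
import socket

def can_connect(host, port, timeout=5):
    """Try to open a plain TCP connection to host:port within `timeout` seconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        sock.connect((host, port))
        return True
    except OSError:  # covers timeouts, refused connections, and DNS failures
        return False
    finally:
        sock.close()

# A timeout (as opposed to an immediate "connection refused") usually means a
# firewall or authorized-networks rule is silently dropping the packets.
print(can_connect("ip.of.my.db", 5432))
```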

Both the DB and the Compute Engine instance are part of the same "project".

I can see that the instance's IP is 35.185.foo.blah.whatever; I know its IP because I ran:

wget "http://ip.jsontest.com" 
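As an aside, from inside a GCE instance you don't need an external service to learn your own external IP: the GCE metadata server exposes it. A sketch, assuming the default single network interface (the path below is the documented metadata endpoint; the required `Metadata-Flavor: Google` header is what distinguishes a deliberate request from an accidental one):

```python
from urllib.request import Request, urlopen

METADATA_URL = ("http://metadata.google.internal/computeMetadata/v1/"
                "instance/network-interfaces/0/access-configs/0/external-ip")

def metadata_request(url=METADATA_URL):
    """Build the request; GCE rejects metadata queries without this header."""
    return Request(url, headers={"Metadata-Flavor": "Google"})

def external_ip():
    # Only resolves from inside a GCE instance.
    return urlopen(metadata_request(), timeout=2).read().decode()

# From a GCE instance: print(external_ip())
```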

Given that an instance's IP is dynamic, and that I launch many instances to distribute tasks and do cloud-compute stuff, how do I grant access to all of these instances?

Granting 35.185.* seems unsafe: what if someone else on GCE tries to reach my DB?

I am new to this and maybe missing something: I assumed anything within a "project" would be able to access all of the project's other assets without issue. I'm sure there is a great reason why that's not the case, but it is lost on me!

This link is the documentation I needed to read to solve the problem.
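For reference, the approach that documentation describes boils down to running the Cloud SQL Proxy on each instance and pointing the connection string at localhost instead of the DB's public IP, so no authorized-networks whitelisting is needed. A hedged sketch (the `PROJECT:REGION:INSTANCE` connection name is a placeholder you would read from the Cloud SQL console):

```python
# On each GCE instance, start the Cloud SQL Proxy first, e.g.:
#   ./cloud_sql_proxy -instances=PROJECT:REGION:INSTANCE=tcp:5432
# The proxy authenticates via the instance's service account, so the DB
# never has to trust any external IP range.

def proxied_dsn(user, password, db, host="127.0.0.1", port=5432):
    """Build a SQLAlchemy/psycopg2 DSN that targets the local proxy."""
    return "postgresql://{}:{}@{}:{}/{}".format(user, password, host, port, db)

# The script above would then connect with:
#   engine = create_engine(proxied_dsn("my_user", "my_pass", "postgres"))
```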

