Databricks supports adding arbitrary files (e.g. JSON, CSV) to Repos.
However, a runtime version of 8.4 or above is required to work with these files.
To read a file from a Repo, a relative path works, e.g.:

import json

with open('./metadata.json') as f:
    data = json.load(f)

print(json.dumps(data, indent=4, sort_keys=True))
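The same pattern can be exercised outside Databricks; this sketch first writes a small metadata.json into a temporary directory (the file content is invented for illustration) and then reads it back with a relative path:

```python
import json
import os
import tempfile

# Create a throwaway directory holding a sample metadata.json
# (the content here is made up for illustration).
tmpdir = tempfile.mkdtemp()
os.chdir(tmpdir)
with open('./metadata.json', 'w') as f:
    json.dump({'version': 1, 'source': 'demo'}, f)

# Read it back with a relative path, as in the Repos example above.
with open('./metadata.json') as f:
    data = json.load(f)

print(json.dumps(data, indent=4, sort_keys=True))
```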
Clicking the down arrow next to the file offers an option to copy the 'File Path relative to the root', e.g.

folder name/metadata.json

However, this relative path does not work with open(), because relative paths are resolved against the current working directory (the notebook's folder), not the repo root.
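The failure can be illustrated with plain string resolution. The cwd below is hypothetical (alice/myrepo stand in for a real user and repo name); the point is that the root-relative path gets appended to a cwd that already ends in the folder name:

```python
# Suppose the notebook lives in <repo root>/folder name, so the current
# working directory (hypothetical values) is:
cwd = '/Workspace/Repos/alice/myrepo/folder name'

# The path copied as "relative to the root":
root_relative = 'folder name/metadata.json'

# open() effectively resolves a relative path against the cwd,
# which is equivalent to joining the two with '/':
resolved = cwd + '/' + root_relative
print(resolved)
# -> /Workspace/Repos/alice/myrepo/folder name/folder name/metadata.json
# The folder name appears twice, so this path does not exist.
```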
Copying the Full File Path instead, e.g.

/Workspace/Repos/<user name>/<repo name>/folder name/metadata.json

does work:

import json

with open('/Workspace/Repos/<user name>/<repo name>/folder name/metadata.json') as f:
    data = json.load(f)

print(json.dumps(data, indent=4, sort_keys=True))
When reading a JSON file with Spark, an absolute path is required, but it is not the Full File Path above.
It can be built from os.getcwd(), and a "file:" prefix is required in the URL:

import os
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Spark needs the driver-local path with a "file:" scheme,
# not the /Workspace/Repos/... workspace path.
df = spark.read.format("json").load(f"file:{os.getcwd()}/metadata.json")
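The URL construction itself can be sketched without a cluster; only the "file:" prefix and os.getcwd() matter here (metadata.json is assumed to sit next to the notebook):

```python
import os

# os.getcwd() returns an absolute path, so prefixing it with "file:"
# yields a file-scheme URL Spark can load from local storage.
path = f"file:{os.getcwd()}/metadata.json"
print(path)
```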